When I look at nature I see a glimpse of the infinite creativity of our Father. Not only did He create every species, down to the seemingly endless variations of trees, brush, mammals, and insects, but He has seen every distinct movement of every creeping thing, down to the blood pulsing through their veins and the fur on their backs swaying in the wind. He has seen the way every single photon cast off from the sun makes contact with the trees, diffusing through their leaves, and touching down on the ground for all of history ad infinitum.
As I gaze at the sky, He gazes back. He is the light that makes contact with the photoreceptor cells in my eyes to reveal the majesty of creation. And every sunset that splashes across the sky, He has painted thoughtfully on an ever-shifting canvas. Yet to me, every day, in its own way, the sun sets more beautifully than I could possibly imagine. And in my limitation, every day is a mystery.
For modern man, travelling to another country typically offers no transcendent experience beyond drinking, eating, and taking the same photo of the Leaning Tower of Pisa that tens of millions of people have taken before you. This ritual is no different to me than if you went to a local district themed after your culture of choice, ate at a restaurant where they spoke their native tongue to you, then stepped outside, stuck your head through a photo stand-in, and checked off the photo in front of your attraction of choice. The only real difference is some abstract notion that you were in the correct geographic location, therefore you may now tick the country off your list. The article The age of average describes a creeping void of homogeneity, using Airbnb and cafe interior design (see fig. 1 below) to illustrate how, all over the world, interior designers have unconsciously agreed upon a globally homogenous style guide which affords well-to-do individuals the ability to travel to the other side of the world and see nothing new. Sure, you may go to a busy street market in Phnom Penh and see all sorts of people living drastically different lives to you, but at the end of the day you'll retire to a luxurious villa and forget all about the poverty surrounding you.
Figure 1: The age of average
Evola's Meditations on the Peaks captures this perfectly in the context of mountain climbing, which had, in his view, been corrupted and trivialised into simply another vain pursuit of hedonism. "we cannot help but notice the presence among our young people, of love for risk and even of heroism. [...] mountain climbing, when experienced only in keeping with this view, would not be easily distinguished from the pursuit of emotions for their own sake". Evola continues, "This pursuit of radical sensations generates all kinds of extravagant and desperate feats and bold acrobatic activities [...] All things considered, these things do not differ very much from other excitements or drugs, the employment of which suggests the absence rather than the presence of a true sense of personality". To Evola, the spiritual majesty of the mountains in the days of antiquity arose from their inaccessibility. Virtually all ancient civilisations situated around mountains saw them as possessing some essence of immortality, conceiving of the mountain as a separate plane of existence.
Technological advancements that make it easier to summit a mountain cheapen the experience. Today, when virtually every mountain has been conquered, and we have helicopters and drone footage of the peaks, the mystery has been completely devoured by the machines of modernity. Consider the case of Tabletop Mountain in Toowoomba. In 2017 there was a proposal to build a cable car across from Picnic Point to the mountain. Tabletop Mountain is a tough but manageable climb for your average able-bodied person, but you still have to make a physical commitment to reach the top. Today, you can climb Tabletop and find yourself completely alone, slightly closer to God, and able to look across the rolling hills at an ever-widening horizon. By making it accessible to everyone, you inevitably destroy what little mystique remains.
This brings us back to the "traveller" persona. I cannot blame, nor judge these people. After all, what spiritually transcendent activities can your average man really engage in today? The potential for enrichment has been sucked out of virtually every activity, and we are told there are no spiritual aspects to pursue within ourselves. You shouldn't have children because it's unaffordable, or because it's bad for the environment. You shouldn't pursue God, because that's for un-developed Neanderthals who haven't yet heard the gospel of Science. The only option presented to these people to achieve fulfilment is travelling, and it makes sense, because there is still a notion of triumph in travelling. You cross vast oceans in a matter of hours, whereas your ancestors - if they were even able to travel - would be crammed into the hull of a ship for weeks or months just to see one new country. Unfortunately for these people, when they make it overseas, they'll typically find many of the same trappings they are accustomed to at home (see below for an image of an "English" street vandalised with American fast-food chains). In my own life, the last overseas travel I did was to Cambodia in 2018, and after I went to the Killing Fields and heard the harrowing tale, I returned to my hotel to find a Burger King and a Cold Stone Creamery on the same street.
Figure 2: An "English" high street
The end result of all of this is the complete commoditisation of spirituality; a sort of drive-thru baptism where people are told that enrichment of the soul is yet another product to be consumed, rather than a lifelong pursuit within yourself. People are sold the idea that to be enriched, all you need to do is buy a ticket to see the sunrise at Angkor Wat and you'll be whole. Given enough time, I believe that even this level of spiritual enrichment is going to be made impossible for the average man (due to climate restrictions and the theft of their discretionary income by central banks) and travel may again be reserved for the social elite. Perhaps then our people will once again begin to look within.
Those who are irresistibly attracted to the mountains have often only experienced in an emotion a greatness that is beyond their understanding. They have not learned to master a new inner state emerging from the deepest recesses of their beings. Thus, they do not know why they seek increasingly wider horizons, freer skies, tougher peaks; or why, from peak to peak, from wall to wall, and from danger to danger, through their experiences they have become inexplicably disillusioned with everything that, in their ordinary lives, appeared to them as most lively, important, and exciting.
Two men went up to the temple to pray, one a Pharisee and the other a tax collector. The Pharisee stood by himself and prayed: ‘God, I thank you that I am not like other people—robbers, evildoers, adulterers—or even like this tax collector. I fast twice a week and give a tenth of all I get.’ But the tax collector stood at a distance. He would not even look up to heaven, but beat his breast and said, ‘God, have mercy on me, a sinner.’ I tell you that this man, rather than the other, went home justified before God. For all those who exalt themselves will be humbled, and those who humble themselves will be exalted.
Luke 18:9-14
You must humble yourself. For all those who exalt themselves will be humbled, but the one who humbles himself will be exalted. Death is the ultimate sign of spiritual rebirth, and to die in the name of God would surely be the greatest honour a Christian can attain.
For my money, the crucifixion is the greatest form of humbling one could experience. Not only are you brutally executed and displayed as a warning to would-be "law"breakers, but those who purport to be your brothers and sisters rally around in lemming-like obedience and celebrate your execution. After all, your fate has been determined by our god, the State.
For the Son of God to willingly humble himself to this degree, in the face of utter betrayal from his people, in the face of false accusations, who are you to have any pride in your insufficient self or your own abilities? If the time ever comes that we should choose our faith or death, Lord please give me the strength to choose You in the face of death. Father, into your hands I commit my spirit.
Amen.
In Figure 1 below, we can see Sweden's annual death count. The average number of deaths between 2011 and 2019 was 90,675 and the total deaths in 2020 was 97,941. This represents an 8.01% increase in total deaths, of which 9,771 (or 9.98% of total deaths) were attributed to COVID-19. Assuming those who died from COVID-19 wouldn't have died from other causes*, excluding those deaths Sweden would have had 2.76% fewer deaths than the average. This could be explained by the voluntary behavioural changes that people make in the absence of government dictates.
* which is an unlikely scenario given that COVID-19 rarely kills anyone in the absence of serious co-morbidities. In fact, the NHS reports that out of 130,624 deaths, only 3,656 deaths (2.80% of deaths) occurred where a pre-existing condition was not recorded (see appendix 1 for source).
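For what it's worth, those percentages fall straight out of the three figures quoted above; a quick sketch of the arithmetic:

// Reproducing the Sweden percentages from the figures quoted above.
const avgDeaths2011to2019 = 90_675;
const totalDeaths2020 = 97_941;
const covidDeaths2020 = 9_771;

const increaseVsAverage = (totalDeaths2020 / avgDeaths2011to2019 - 1) * 100;                     // ≈ 8.01%
const covidShareOfDeaths = (covidDeaths2020 / totalDeaths2020) * 100;                            // ≈ 9.98%
const nonCovidVsAverage = ((totalDeaths2020 - covidDeaths2020) / avgDeaths2011to2019 - 1) * 100; // ≈ -2.76%

console.log(increaseVsAverage.toFixed(2), covidShareOfDeaths.toFixed(2), nonCovidVsAverage.toFixed(2));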
Figure 1: Sweden death rate
Sweden now has a greater than 50% vaccination rate and has experienced 28 deaths between July and mid-August.
As I mentioned in my post On mandatory COVID-19 vaccinations, the COVID-19 deaths are only one factor when it comes to measuring the impact of the pandemic (and the government response).
Sweden's inflation rate peaked in April 2021 at 2.2% YoY (Figure 2), after which the trend reversed and they reported 1.3% YoY inflation in June 2021. By contrast, the United States "peaked" in June 2021 at a massive 4.5% YoY, but we can't really say this has peaked since the trend is still upwards. This can be partly explained by the US policy of closing businesses and paying workers to stay home with enhanced unemployment benefits. I couldn't find any statistics on Sweden's business closures, but in the United States, Yelp data showed that 60% of business closures became permanent in 2020, and they closed out the year with 200,000 extra permanent business closures.
Figure 2: Sweden inflation rate
Figure 3: United States inflation rate
Appendix 1: NHS Freedom of Information request regarding COVID deaths
With COVID-19, early signals from China were very worrying indeed. By January 2020 we saw footage of people collapsing in the streets, people being barricaded in their houses, dragged out of their cars at checkpoints, and trucks patrolling the streets spraying chemicals into the air. This overt propaganda (which included sock puppet accounts spreading pro-lockdown messaging on social media) came after the covert attempt to suppress information about COVID by imprisoning doctors and falsely claiming that human-to-human transmission was impossible.
I've heard many theories about the motivation behind this disinformation campaign, the most compelling being that China knew how serious this virus would be and wanted to let it spread to the rest of the world so they didn't suffer alone. My theory is much simpler, and it's based on a pattern we've seen many times in history. The nature of pathological totalitarian governments such as China or the USSR leaves very little tolerance for mistakes. As we saw in 1986 after the meltdown of the nuclear reactor at Chernobyl, the authorities engaged in a multi-year coverup to deflect blame. Similarly, during the H1N1 outbreak, local officials suppressed information about the outbreak and didn't warn the public, fearing reprisal from Beijing.
So, how deadly is SARS-CoV-2 compared to influenza? According to the Johns Hopkins COVID dashboard, the global death rate (197,872,410 cases; 4,217,383 deaths) is 2.13%. The death rate fluctuates considerably between countries, with the USA at 1.73%, France at 1.81%, and Australia at 2.69%. Strangely, India (where we saw many reports of people dying in the streets when the delta variant started spreading) has a far lower death rate (1.34%) than comparably richer countries. At 1.73%, the US death rate is roughly 13.3 times the influenza death rate, so clearly this is a serious illness and we should not be flippant about it, but the question remains: have we overreacted?
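These death rates are crude case fatality ratios (deaths divided by confirmed cases, which ignores undetected infections); a quick sketch of the arithmetic, where the implied influenza rate of roughly 0.13% is my own back-calculation rather than a figure from the dashboard:

// Crude case fatality ratio: deaths / confirmed cases.
const cfr = (deaths: number, cases: number) => (deaths / cases) * 100;

console.log(cfr(4_217_383, 197_872_410).toFixed(2)); // global: ≈ 2.13%

// "13.3 times the influenza death rate" implies a flu rate of roughly
// 1.73 / 13.3 ≈ 0.13% (my back-calculation, not a quoted figure).
console.log((1.73 / 13.3).toFixed(2)); // ≈ 0.13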
The conversation about "15 days to slow the spread" started in mid-March 2020 in Australia and I agreed with the approach at the time. We didn't know how deadly the virus would be and needed to gather data to steer public health policy. For those of you with stone slabs for ceilings, the purpose of the 15 Days Doctrine (also known as "flatten the curve") was to stop the initial wave of infections from overwhelming hospitals. Notably, the purpose was not to stop infections entirely. With 1.5 years of hindsight, we now know that (in the United States) 95.2% of deaths are in the 50-and-up age group and 79.2% of deaths are in the 65-and-up age group (see figure 1 below). Deaths in the 30-39 age group represent 0.02% of overall cases, which drops off dramatically for the 18-29 range (at 0.007% of cases), and for school-aged children (0-17 years old) the figure is 0.001%, or roughly 1 in 100,000 cases. The people least affected by this illness have paid the highest price as a result of the hysteria. They have to shoulder the mental burden of isolation during their most important developmental years, potentially being locked up with abusive parents (when school may have been their only respite from physical or verbal abuse), and online learning programs from a school system that is already a complete failure when it comes to academic outcomes. They will also have to shoulder the financial burden of the money printing required to prop up the stock market and pay the employees of businesses that were shut down by lockdown measures.
Figure 1. Deaths in the United States by age group, as a proportion of deaths and of overall cases. Data: CDC
Age group | Number of deaths | % of deaths | % of cases |
---|---|---|---|
0-17 | 340 | 0.06% | 0.001% |
18-29 | 2,493 | 0.41% | 0.007% |
30-39 | 7,145 | 1.18% | 0.020% |
40-49 | 14,976 | 3.14% | 0.054% |
50-64 | 96,318 | 15.96% | 0.275% |
65-74 | 134,601 | 22.30% | 0.385% |
75-84 | 165,059 | 27.35% | 0.472% |
85-up | 178,572 | 29.59% | 0.511% |
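The 95.2% and 79.2% figures quoted above are just sums over the "% of deaths" column of this table; a minimal sketch of that aggregation:

// Share of US COVID-19 deaths by age bracket, taken from the table above.
const deathShareByAge: Record<string, number> = {
  '0-17': 0.06, '18-29': 0.41, '30-39': 1.18, '40-49': 3.14,
  '50-64': 15.96, '65-74': 22.30, '75-84': 27.35, '85-up': 29.59,
};

const sumShares = (groups: string[]) =>
  groups.reduce((total, group) => total + deathShareByAge[group], 0);

console.log(sumShares(['50-64', '65-74', '75-84', '85-up']).toFixed(1)); // 50 and up: ≈ 95.2%
console.log(sumShares(['65-74', '75-84', '85-up']).toFixed(1));          // 65 and up: ≈ 79.2%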
As I noted earlier, 87% of H1N1 deaths occurred in the under 65 age group which is an almost perfect inverse of the pattern in COVID deaths. Knowing this, we could have designed a program to minimize the impact on the young while protecting the elderly and severely sick, but we didn't. It's like having a fire on your stove, and instead of reaching for a fire extinguisher you get in a plane and dump 12,000 liters of water on your house from overhead. With a common sense policy allowing freedom for those who are not at risk, the cost/benefit analysis is left up to individuals armed with the best information. If you are not in an at-risk group but you don't feel safe then you are perfectly welcome to wear masks, work from home, order all your groceries online, etc.
The fundamental mistake that people make when advocating for universal lockdowns is a lack of consideration for the unseen consequences. This idea is best illustrated by Frederic Bastiat in his essay "That which is seen, and that which is not seen", where he lays out the Broken Window Fallacy. If someone throws a rock through a shopkeeper's window, people will often attempt to placate the shopkeeper by saying "at least you're giving work to a glazier who will use that to buy goods and enrich the community". What they fail to recognize, however, is that the money used to replace the window could have been used to buy a new shirt. Whereas without the broken window the shopkeeper would have had a window and a new shirt, now he only has a window, and the community is deprived of a shirt it could have afforded otherwise. According to a report by Douglas Allen, the lockdowns in Canada may have saved a cumulative 22,333 years of life as a result of reduced COVID infections. The other side of the coin is the up to 6,300,000 years of life lost to the lockdowns themselves (6,300,000 ÷ 22,333 ≈ 282, which is where the upper bound of the ratio quoted below comes from).
As Douglas Allen says "the cost/benefit ratio of lockdowns in Canada, in terms of life-years saved, is between 3.6–282. That is, it is possible that lockdown will go down as one of the greatest peacetime policy failures in Canada’s history".
Indeed, in October 2020, more Japanese people died by suicide in that single month than had died from COVID-19 in the preceding 10 months of the pandemic. A survey by Grossman et al found that 60% of respondents reported increased drinking due to stress and boredom, and in Australia a survey revealed that nearly 10% of women had experienced domestic violence during the pandemic, with 50% saying the abuse had become more frequent or severe since the pandemic began.
As if lockdowns weren't contentious enough, we turn to the topic of vaccines. My typical conversation about COVID vaccines follows this script:
"Are you going to get the vaccine?"
"No, I don't think so, I'm not at risk"
"Oh my God, you anti-vaxxer conspiracy theorist. You're killing grandma!"
The false dichotomy has been set up such that you can only be pro-COVID-vaccine or anti-all-vaccines. I won't virtue signal about vaccine support since that's become the new "I can't be racist, I have black friends", but I will say I'm happy to not be living in an iron lung. I am neither anti-vax, nor anti-COVID-vaccine. I'm pro-COVID-vaccine for those who are at risk and those who voluntarily choose to get it.
A typical talking point of the fervently pro-COVID-vaxxers is that they are doing it out of a sense of duty to the collective to reach herd immunity. This notion falsely assumes that everyone being vaccinated necessarily means the pandemic is over. A recent CDC study found that 74% of infections in a Massachusetts outbreak were among the fully vaccinated. Are you surprised? You shouldn't be - breakthrough infections were expected per the clinical data in trials. The number of breakthrough infections is almost certainly being understated since the bulk of these breakthrough infections are asymptomatic, and therefore, rarely cause alarm for individuals to get tested.
In Israel (which has one of the highest vaccination rates in the world) half of the cases in a recent outbreak were among the fully vaccinated, and researchers estimate that the Pfizer vaccine is now just 39% effective in preventing infections which has spurred talks about a third booster shot. This could be due to the spread of new variants (due to the narrow scope of the current generation of COVID vaccines) or due to a reduction in antibody levels among the vaccinated.
So, it has a marginal effect in preventing infections, but it's actually quite good at preventing severe symptoms. Armed with the data, we can conclude that an ideal goal would be "hybrid immunity" where at-risk individuals get the vaccine and the remainder of people are allowed to decide whether to be vaccinated based on their individual risk appetite. This would allow us to begin resuming normal life with no risk of overwhelming hospitals and with few deaths. Consider Figure 3 (below) which charts the death and hospitalization risk against age (with and without comorbidities) amongst the unvaccinated. As a 30 year old with no comorbid conditions, your risk of hospitalization (i.e. very severe symptoms) is a mere 2.7%. When you include those with asthma the percentage rises to 5.1%. Including obesity, the number climbs to 12.9%. Indeed, apart from age, obesity is the single biggest risk factor for COVID-19 (which makes the public health policy decision to lock people inside their homes and close gyms all the more absurd). In fact, lung cancer patients have better odds of survival than the obese.
Figure 3. Risk of hospitalization and death by age
The singular driving force of all living creatures (and thus, their evolution) is this: "Accruing the most resources with the least effort".
Pfizer made US$7.8bn in Q2 of 2021 alone and is forecasting sales of US$33.5bn for 2021. In Figure 4 below you'll note that, for the US market, Pfizer makes $19.50 per dose. If the US government were to mandate vaccines for everyone and placed its orders exclusively with Pfizer, that would mean approximately $13bn in the US market alone (roughly 330 million people × two doses × $19.50 ≈ $12.9bn). You don't have to be an economics expert to see that the incentives are aligned to get as many people vaccinated as possible, and given the growing calls for booster shots, the gravy train keeps rolling for these pharmaceutical companies. It doesn't matter to the pharmaceutical companies whether people need the vaccine or not; their singular focus is on generating profits based on vaccines sold. This doesn't make Pfizer immoral (amoral, perhaps) - they're simply responding to the incentives provided by the government.
Figure 4. Cost per shot for COVID vaccines
As a self-professed capitalist, I have no issue with companies making money, including pharmaceutical companies. Given the cost of the regulatory burden for developing new drugs, and the fact that close to 90% of all drugs are rejected by the FDA, companies have to recoup costs not just for the development of the drugs that make it to market, but also for the drugs that don't. When profits arise in a free market they are the result of voluntary interactions between the company and the individual. The individual decides that the product is worth more to them than their money. The nature of mandatory vaccination means that the profits are derived not from voluntary interactions, but from coercion at gunpoint. I've heard several people falsely claim that the vaccine is free, and to those people I say: there ain't no such thing as a free lunch. The vaccine is being paid for in one of two ways: taxation or inflation.
In a free market the decision making would be completely decentralized. Health insurance companies would analyze the risk to them for each individual to get COVID (accounting for lifestyle factors such as obesity, cigarette smoking, etc.) to determine the cost to them with and without the vaccinations. They would also look at the clinical data for each vaccination and determine the risk of long-term side-effects as a result of vaccinations, and if they determine that it would be cheaper/less risky for them if people were vaccinated, they would add a premium loading for those who refuse to get the vaccination. If you are at risk and you refuse the vaccine, then you may have to pay 20-50% higher premiums because the insurance companies know you're more likely to need expensive medical care as a result of infections.
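As a toy illustration of the kind of actuarial comparison described above (every number here is invented purely for the example; this is not real claims data):

// Toy model: an insurer comparing expected COVID-related claim costs for one
// policyholder with and without vaccination. All inputs are made up.
interface RiskProfile {
  infectionProbability: number;       // chance of infection over the policy period
  hospitalisationProbability: number; // chance of hospitalisation given infection
  hospitalisationCost: number;        // average claim cost of a hospitalisation
}

const expectedClaimCost = (r: RiskProfile) =>
  r.infectionProbability * r.hospitalisationProbability * r.hospitalisationCost;

const unvaccinated: RiskProfile = {
  infectionProbability: 0.2,
  hospitalisationProbability: 0.10,
  hospitalisationCost: 40_000,
};
const vaccinated: RiskProfile = {
  infectionProbability: 0.2,
  hospitalisationProbability: 0.02,
  hospitalisationCost: 40_000,
};

// The gap in expected cost is what could show up as a premium loading for
// an at-risk customer who refuses the vaccine.
const premiumLoading = expectedClaimCost(unvaccinated) - expectedClaimCost(vaccinated);
console.log(premiumLoading); // 800 - 160 = 640 per year in this made-up scenario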
Trust the science, Galileo; we all know the universe revolves around the Earth.
A cursory examination of the history of philosophy and science tells a dramatic tale where no good deed goes unpunished and many of the things we now take for granted were paid for with the blood of those who refused to conform to consensus. Socrates was sentenced to death for "corrupting the youth" and failing to acknowledge the gods of the city. Similarly, at the time of his death, Galileo was completely blind after 9 years of house arrest for the heretical pronouncement that the Earth revolves around the sun... Ironically, in both of these examples the offending party was persecuted for going against the religious narrative. Today, in our very secular, (allegedly) rational society, a new religion has been born. You run counter to the prevailing pop-sci narrative at your own peril, as the Templars of the "I Fucking Love Science" Facebook page will shout you down and ostracize you from society. In the words of the immortal Anthony Fauci, the great prophet of the cult of Scientism, "attacks on me are attacks on science".
From a pragmatic standpoint, the percentage of the public who have received COVID vaccinations is entirely arbitrary. The only thing that matters is: are the vulnerable vaccinated? If you are 70 years old, you should probably get vaccinated. If you are a 30 year old with no comorbidities you don't need to get vaccinated, and indeed you probably shouldn't, but ultimately it is YOUR choice.
For the people on the political left who used to be skeptical of government overreach, the inherent self-contradiction of advocating for mandatory vaccines is particularly offensive. If you concede that there are things the government can force you to do (at gunpoint) for your health, then you have to concede that there are things that the government can force you NOT to do (at gunpoint) for your health. Like have an abortion, or get gender reassignment surgery, for example. By giving up bodily autonomy you open the door for all forms of medical tyranny.
The statist doctrine of mandatory vaccination can only possibly lead to the fractionation of society into castes where rights are deprived from those who don't kowtow to the dictates of the state. This obviously violates the 800 year history of Man's natural rights enshrined in law since the Magna Carta was written. In every medical procedure, the doctor is required to explain the risks, and the procedure only goes forward if the individual (or their kin) gives informed consent. If a treatment will fix your eyes but you'll lose a leg in the process we all agree that it's vital for the patient to understand the implications of the treatment, and decide that the consequences of not getting the treatment are greater than the consequences of getting the treatment. You cannot express informed consent for a procedure which you're being forced to undergo any more than you can call kidnapping "marriage", or rape "lovemaking". You cannot take what can only be given voluntarily.
Luke Boyle BAppSc, major in microbiology, since credentials are required to have an opinion now.
This primer on Mencken's philosophy was quite profound, and, sadly, very underappreciated. I feel that this is an important perspective to re-frame the debate around statism versus a free society. The author shows that Mencken is not really a cynic, but a realist. As Mencken said, "Reconciling ourselves to the incurable swinishness of government, and to the inevitable stupidity and roguery of its agents, we discover that both stupidity and roguery are bearable - nay, that there is in them a certain assurance against something worse."
Indeed, his writings didn't bring about a free society - in fact, he correctly predicted that government would continue to grow at an exponential rate after his death. Advocating for the abolition of the state (or even the greater utopian vision of a limited state) is like trying to steer a cruise ship with an oar. So, how did Mencken work for a lifetime and still carry on with relative happiness? He didn't write to persuade. The author notes, "Writing to persuade can leave you with many peculiar stances. But writing to express your libertarian beliefs is a much more straightforward enterprise, and your writing is then relevant forever and won't come back to haunt you".
This makes me think of the modern Conservative, whose current platform generally resembles the progressive platform of yesterday. It's an eternal game of rugby where the progressives charge ahead, and the conservatives celebrate a successful tackle without noticing they've ceded ground. When the progressives say "we want $3 trillion in equitable infrastructure spending", if your response is "Let's compromise. How about $1.5 trillion?", you have already lost the debate. You tacitly admit that some government spending is good, and if some government spending is good, then you obviously can't have too much of a good thing, so why stop at $1.5tn? You are attempting to persuade the progressive to your position that government spending is evil by agreeing to government spending. Instead, you should argue from the principle that all government spending is necessarily funded by theft at gunpoint and therefore any concession is unconscionable.
The book has shown me that I have been far too utopian in discussions about free societies. Rather than listing all the ways that a free society will be better for the individuals within it - given that this is entirely subjective, and many people find a great deal of comfort in being subordinate to the coercive monopoly of the state - it is far more productive to argue from first principles. You may not be liked, but you will be authentic, and that is far more important in the long term. No amount of concession from you will make a free society any more likely. You'll either be hated for adhering to your principles, or you'll be forgotten because you abandoned them.
]]>"The fraud of democracy, I contend, is more amusing than any other... All its axioms resolve themselves into thundering paradoxes, many amounting to downright contradictions in terms. The mob is competent to rule the rest of us - but it must be rigorously policed itself. There is a government, not of men, but of laws - but men are set upon benches to decide finally what the law is and may be [...] I confess, for my part, that it greatly delights me. I enjoy democracy immensely. It is incomparably idiotic, and hence incomparably amusing."
H. L. Mencken
The book is very entertaining, and it's a very easy read (it took me probably 5 hours, and I'm a particularly slow reader). I went through with a highlighter and emphasised the key points, but I found that as I got to the middle of the book, the insights started to dry up. By Chapter 8 (A republic is born), it started to drag. I suspect - though I may be wrong - that this is approximately where the original allegory of Able, Baker, and Charlie growing the economy ended, and where the younger Schiff's original portion began. It was still entertaining, but unlike the start, which was packed with easy to understand explanations of economic principles, the middle part leading up to the housing crisis was mostly a rushed re-telling of history with a healthy helping of fish puns.
When the authors got past the historical portion and into the future, the book read better, and it ended very strongly. As this book came out in 2010, the authors envisioned a future where America had to face the music and Obama took responsibility for his economic policy blunders. Unfortunately, with hindsight, we know that sort of happy ending is rare in politics. Obama's and the Fed's policies became worse, and Trump then inherited and continued them. Ten years after publication, the crash described in the book hasn't arrived as the authors expected, but given the state of the American economy it seems more likely by the year.
Here are some key takeaways from the book:
(After increasing the productive capacity of the island; that is, catching more fish) "This didn't happen because the three guys were unsatisfied with their limited lifestyle. Their hunger, which is labeled "demand" in economic terms, was necessary to spur economic growth but not sufficient to achieve it."
"With their extra fish, the islanders can finally eat more than one fish per day. But the economy didn't grow because they consumed more. They consumed more because the economy grew."
(About Able giving a loan to someone to take a holiday) "Not only would such a transaction put his savings at unnecessary risk, but it would mean that the capital would be unavailable for more productive loans."
"In actuality, loans to consumers that do not fundamentally improve productive capacity are a burden to both the lenders and the borrowers."
"Steadily dropping prices also encourage savings as islanders begin to understand that their fish would likely buy more goods in the future than they do in the present."
Keynesians react to falling prices like a vampire reacts to a crucifix. Such a reaction is understandable when you realise that their theories are predicated on the idea that spending (i.e. consumption) equates to economic growth. This is why their primary course of action when faced with an economic contraction is monetary stimulus. Inflation is the best way to ensure people spend what they make, because if people know prices are going to rise, they are more likely to spend their money on goods they'll need in the future.
I'd suggest this book for people with a cursory interest in economics but without much of a background. It's quite easy to grasp and would be good for young high school students.
I give it a 6/10.
So, why are these low wholesale prices not being reflected in the ACT market? It is due to market interference by the ACT government. Whereas other suppliers would change the retail price per kWh to track the wholesale spot market, the legislators are forcing EvoEnergy to abide by long-term fixed-price contracts negotiated by the government. You can almost guarantee this type of deal would not happen in the free market, as most successful companies understand risk forecasting very well. In times of high energy costs (such as in early 2019), this may have been beneficial; however, now that we're at relative lows, EvoEnergy must pay the excess to cover the shortfall. EvoEnergy expects payments under these fixed-cost contracts to more than triple from $42 million this financial year to $127 million in FY2021-22. The primary reason for the government interference in the energy sector is the ACT's push for 100% renewable energy (which they purportedly achieved in late 2020). This claim of 100% renewable energy is sleight of hand, as the electricity supply that powers the ACT is still non-renewable; they simply offset their energy usage with solar and wind farms. It's fortunate they don't depend on these particular renewable sources as their primary grid.
California proved during their 2020 summer season that solar and wind don't scale well to high demand. Their rush to shut down their nuclear and coal power plants resulted in a vulnerable grid. Solar and wind in particular are vulnerable because they have to be in ideal conditions to effectively produce power. As the sun went down on those hot summer nights, Californians turned on their air conditioners only to find that their solar network doesn't work particularly well without a shining sun! To compensate for the shortfall in production, they had to increase the output of their coal power plants. A resilient power network surely can't be one that falls apart on the first overcast, still day. China understands this problem well. They are planning to build six to eight new nuclear reactors per year between 2020 and 2025 (which would make them the largest producer of nuclear energy by 2022, surpassing America and France).
Back to the ACT, where the renewables are a secondary part of the grid, offsetting the emissions of the primary (non-renewable) part of the grid. A fully renewable, reliable grid is not achievable without nuclear (with today's technology, that is), so the ACT has to double-dip to give the appearance of a fully renewable grid. What this means for consumers is that they have to pay for the non-renewable AND the renewable portion of the electricity.
So, what is the government doing to remediate the issue they've caused? Will they allow the company to negotiate its pricing contracts itself? Or perhaps walk back their 100% renewables requirement? Or perhaps they'll even invest in nuclear? If you answered "none of the above", you'd be correct. Instead, the government will increase electricity concessions by $50 (up to $750) annually this financial year, and a further $50 (up to $800) next financial year. The concession represents a $24.8 million annual expense for the government, in addition to the one-off $1 million investment going to the "utilities hardship fund". That's like pushing someone overboard and then applauding yourself when you throw them a life jacket. You still pushed the poor bastard in the water, but at least the TV crew was there to watch you rescue him.
It's the typical cycle of well-intentioned legislation.
The end result is always the same. The biggest losers are the low-income consumers who now have to spend more of their income for the same service. The government spends more (likely via deficit spending), and likely collects less in taxation. You may imagine that the company is doing well - after all, they've raised their prices, so they must be making more money - but that is not the case. The best way to make money as a business is to make your goods and services more affordable, and therefore allow more people to buy. As a result of these price increases, people will change their behaviour to use less power. They'll put on an extra pair of socks, and drink their tea a little hotter this winter. Perhaps they'll install a wood stove to heat their home without using electricity. The government doesn't even benefit, apart from their PR victory. In this instance, everyone is worse off when they have to pay for the moral hazard of well-intentioned legislation.
When priced in gold, you can see that over the last hundred years (see the graph below) the US Dollar has lost virtually all of its value. The graph below plots the price of gold per ounce against the USD from 1915 to the present. In 1915 the price of an ounce of gold was $19.25. When the dollar was created in 1792, the cost of gold was $18.60 per ounce, which we will refer to as the baseline value of the dollar. Between 1792 and 1915 (123 years) the price of gold only increased by 65 cents (roughly half a cent per year, or an average of 0.028% p.a. in inflation). During this period wages were relatively flat; however, America also became heavily industrialised and the cost of living fell by half. So not only did the value of your money remain stable, you were able to buy more goods.
You'll note that the price of gold was flat until ~1932, when the government decided that it needed to devalue the currency so it could create more dollars (since they were on a gold standard, they could only have as many dollars as they had gold reserves), and the legislature re-defined the value of an ounce of gold to be $35. Back then, it was a simple change of definition, since the dollar was still tied to, and redeemable in, gold and silver. You'll notice that all hell breaks loose in 1971 when America left the gold standard (duping the entire world into accepting fake money tied to no real-world value for their exports), and people were no longer allowed to redeem gold for their dollars (including foreign investors). Between 1971 and today, the price of gold rose from $36.56 to $1715.24 (as of March 2021). That is a face-melting 4591.56% increase in 50 years, or an average of 91.8% per year.
The graph begins shortly after 1913, when the Federal Reserve Act was passed (creating the Federal Reserve) and the Sixteenth Amendment was ratified (allowing the government to tax income). The Federal Reserve was intended to be an apolitical, non-government organisation so that the creation of money would be separate from the legislature. That seems ridiculous in hindsight, as the Fed and the legislature are just two sides of the same spending-addicted coin, hell-bent on debasing the currency no matter the cost. This isn't anything new; the Fed has been monetising the government's debt since its inception, but I think the wheels came off when the legislature made the decision to leave the gold standard to fund their war machine abroad. The few checks and balances that previously existed evaporated. Previously, the legislature had to vote to devalue the dollar before the Fed could print more money; after leaving the gold standard, the Fed now digitally prints tens of billions of dollars per month to buy treasury bonds and fuel the stock market bubble. Similarly, the income tax was another ill-fated policy. Originally, the income tax rate was 1% for people earning $0 to $20,000 (which is ~$529,911 today according to the CPI, or $1,782,067 when priced in gold) with a top nominal tax rate of 7%. Clearly, in hindsight, these low rates were the camel's nose under the tent, and this was the government laying the groundwork for massive tax hikes during World War I (yet another war America didn't need to be involved in). By 1918, the top nominal tax rate was a whopping 77%.
All of that background is just to highlight the state of decay America is in, which continues to accelerate as government grows. The reason I like to view inflation through the lens of the gold price is that gold has been used as money for thousands of years, which is a much more meaningful time-scale than the ~230 years the USD has been in existence. Interestingly, when you price the Dow Jones index in dollars, it's at a record high of $32,981.55 (compared to $1,457.37 in 1915), an increase of 2163%. However, when you price it in ounces of gold, today it sits at 19.23 oz (compared to 2.86 oz in 1915), an increase of 572.4% - a far cry from the >2,000% when measured in dollars.
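For transparency, here's the arithmetic behind those figures, using only the prices already quoted above (a sketch, nothing more):

// Price ratios behind the gold and Dow figures quoted above.
const goldPrice1971 = 36.56;
const goldPriceMar2021 = 1715.24;

const goldIncreasePct = (goldPriceMar2021 / goldPrice1971 - 1) * 100; // ≈ 4591.6%
const simpleAnnualPct = goldIncreasePct / 50;                         // ≈ 91.8% per year (a simple average, not compounded)

// The Dow Jones priced in ounces of gold instead of dollars.
const dowMar2021 = 32_981.55;
const dowInGoldOunces = dowMar2021 / goldPriceMar2021; // ≈ 19.2 oz

// The 1913 income-tax bracket ceiling of $20,000, re-priced in gold at March 2021 prices.
const goldPrice1915 = 19.25;
const bracketCeilingInGold = 20_000 * (goldPriceMar2021 / goldPrice1915); // ≈ $1.78m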
Now that you're caught up on the historical horrors of the US dollar, we can talk about the present horrors. 40% of all dollars in existence were printed in 2020, and already in 2021 nearly half a trillion dollars have been added to the national debt. The CPI is the measure of inflation we typically use (and it is actively manipulated to understate the true increase in goods prices), and for March 2021 it showed a 2.6% year-on-year increase. It is no longer a conversation of "massive inflation is coming". Massive inflation is HERE! Look at the price of lumber, for example (see below), which has had a 47% increase so far this year, after a 125% increase last year.
It should be clear from the preceding rant that I believe gold is the best way to hedge yourself against inflation, especially since, unlike wheat and oil, it never decays. Remember, it's not that gold is getting more expensive, it's that the dollar is getting weaker. So, trading your dollars for precious metals and other commodities to hedge yourself against currency fluctuations is the best course of action. What should you do if you don't have the available capital to buy gold? Firstly, you can buy much larger quantities of silver for far lower prices than you can buy gold, and it's more viable for everyday exchanges because it's more divisible.
Assuming you can't buy silver either, you should not let your money wither away at 0% interest in the bank or risk losing it on overpriced stocks (I'm not advising you pull your retirement funds out of the market, but you could consider mixing in some inflation hedges like gold and gold mining stocks). The best thing you can do locally is stock up on goods that you know you will need down the road. In 2017 Mark Cuban (Newly converted Bitcoin bull) said that people struggling to get ahead should buy in bulk and on sale. In hindsight, this was fairly prophetic considering where we are today. Many goods on the shelves are experiencing unprecedented price surges, and that doesn't even address the real possibility of serious goods shortages in the near future. Start working out how much of each non-perishable good you use per month and extrapolate that for a year. Here are some ideas for you:
If you use 4 rolls a month, that works out to 48 per year. Buy four 24 packs and you are set for two years without buying toilet paper. Make sure you store it in a cool, dry place.
This is rather self-explanatory, but be aware that powdered detergent only has a shelf life of 6 months, so consider using liquid detergent.
Colgate recommends a maximum of two years' shelf life, so don't buy too much.
You can really go hell for leather with this, just consider if you have the right conditions in your house to store them long term.
If you are curious about getting started buying precious metals, check out Schiff Gold for Americans, and Perth Mint or ABC Bullion for Australians. If you are worried about inflation, you should steer clear of buying gold/silver ETFs, as you don't have the security of physical metal and it can be easily seized by the government. With a gold broker like the above, you can store it at their secure facilities and request redemption at any time. I don't keep any physical metals as I don't have anywhere to securely store them, but a bank safe deposit box would be a good alternative.
Good luck out there everyone.
What I find striking about this decision is that the court ruled that Trump must allow users to engage with his posts on his personal Twitter account, in spite of the fact that they could still engage with him on his official government account (@POTUS). Given that the court decided that public officials must allow all users to engage with them, it would stand to reason that public officials shouldn't be excluded from using the new public square. Which is why I find it terrifying that Twitter was able to de-platform Trump before his presidential term was over, and Laura Loomer was denied access to Twitter in spite of her being selected as the Republican nominee for her congressional race. If Twitter truly is the public square, then why are they allowed to permanently ban American citizens and elected officials for expressing opinions Twitter disagrees with? Are they not violating those banned users' first amendment rights to engage in the public square?
Trump's appeal of the 2nd circuit court decision I mentioned above recently reached the supreme court. This could have been an important moment, as it's one of the first Big-tech censorship cases that would have been heard by the supreme court, but alas, the case was dismissed. Clarence Thomas concurred with the dismissal of the case but left some golden nuggets of wisdom in his statement.
Thomas said "this petition highlights the principal legal difficulty that surrounds digital platforms — namely, that applying old doctrines to new digital platforms is rarely straightforward. ... some aspects of Mr. Trump's account resemble a constitutionally protected public forum. But it seems rather odd to say that something is a government forum when a private company has unrestricted authority to do away with it". Indeed, how can you say that Twitter is a constitutionally protected forum when a low-level employee - whose labour is potentially outsourced to a contractor in a foreign nation (as is the case with Youtube and Facebook) - has the ability to temporarily or permanently suspend the account of American elected officials? Rep. Marjorie Taylor Greene was recently suspended from Twitter for an Easter Sunday post saying "He is Risen". Twitter later claimed that this was done "in error". How is a rogue employee allowed to exercise that much authority over the speech of an elected official?
This is only scratching the surface, as there is also the matter of big tech firms acting in concert to ban people from all platforms simultaneously, presumably so the banned users are not able to rally their support on other platforms and prepare to mitigate the loss of audience. In 2018, Alex Jones was banned by Facebook, Apple, YouTube, and Spotify on the same morning. Surprisingly, the only major platform not to ban him at the time was Twitter. These firms are the modern day Robber Barons, but instead of colluding to fix prices, they collude to exclude people from access to mass communication. YouTube actually has a 3 strike system to give users some warning when they're close to being removed from the platform, but they have no problem circumventing this moderation program to remove people as a political statement. YouTuber Mumkey Jones was given 3 strikes in a matter of minutes on content that had previously been moderated and found to be suitable. Stefan Molyneux didn't even get the 3 strikes; he was simply removed outright. Keep in mind, these bans are permanent. In our society, even some murderers are given a path to redemption.
"Today's digital platforms", Thomas continues, "provide avenues for historically unprecedented amounts of speech, including speech by government actors. Also unprecedented, however, is the concentrated control of so much speech in the hands of a few private parties. We will soon have no choice but to address how our legal doctrines apply to highly concentrated, privately owned information infrastructure such as digital platforms." He said "The Second Circuit court's decision that Mr. Trump's Twitter account was a public forum is in tension with, among other things, our frequent description of public forums as 'government-controlled spaces'".
The real meat and potatoes of the statement is where Thomas says "If part of the problem is private, concentrated control over online content and platforms available to the public, then part of the solution may be found in doctrines that limit the right of a private company to exclude". Our legal system, according to Thomas, has "long subjected certain businesses, known as common carriers, to special regulations, including a general requirement to serve all comers". An example of such a common carrier would be a telephone company. The telephone infrastructure was built by private companies, but existing laws dictate that those companies are not able to pick and choose customers based on their ideological bent.
If these individual companies are treated as carriers for a particular service (i.e. Twitter is the common carrier for tweets, Facebook is the common carrier for boomer memes), then they will probably just have to stop banning users who are posting lawful content in order to comply. But if the service is "social media", and all social media companies are considered common carriers for "social media", I believe there are far more dire consequences for them than just having to platform ideas and people they disagree with. What happens when I sign up for AT&T and I try to call you on Verizon? The call connects seamlessly in spite of us being platformed by different companies, anywhere in the country, at any time.
What I think is next for social media companies as common carriers is that if I make a post on Facebook, you must be able to access it and engage with it on Twitter. This essentially means there would have to be some standard format for sending and storing social media content, and all existing and future social media companies (Thomas also says that "no substantial market power is needed so long as the company holds itself out as open to the public") would have to adhere to this standard, ensuring the complete suffocation of innovation in this space. As is the case with telecommunications companies and banks, this may also mean that Twitter and Facebook would have to abide by KYC (Know Your Customer) legislation, with users' real-world identities tied to their accounts and reported to regulatory bodies. That may also mean the companies must comply with counter-terrorism legislation and report users engaging in potentially illegal activities (or face large fines for failing to report them).
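Purely as an illustration of what a "standard format" might look like - this is an invented sketch, not any existing or proposed standard - here is the kind of shared schema such a regime would imply:

// Invented sketch of a cross-platform "post" schema that interoperating
// social media companies might be forced to agree on. Not a real standard.
interface FederatedPost {
  id: string;             // globally unique, e.g. "facebook.com/posts/123"
  author: string;         // account identity (KYC rules would tie this to a real person)
  publishedAt: string;    // ISO 8601 timestamp
  content: string;        // the post body
  inReplyTo?: string;     // id of another post, regardless of which platform it lives on
  originPlatform: string; // "facebook", "twitter", ...
}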
Essentially, only extremely large companies would have the resources to abide by the legislation, and the government will have destroyed the growth of yet another industry. I think that outcome is a long shot, especially considering Facebook is now the largest corporate lobbyist, outspending even the telecom and defence companies (and that doesn't include the nearly $500 million Mark Zuckerberg "donated" to fund the 2020 presidential election), but I believe it's one of the most important issues we presently face.
I look forward to seeing future supreme court cases on this topic, particularly the statements of Clarence Thomas, who has consistently been the voice of reason on the bench. Clarence Thomas' senate confirmation hearings were oddly reminiscent of the Brett Kavanaugh hearings in 2018, as a report of alleged sexual misconduct was leaked to the press and his reputation was dragged through the mud. Thomas proceeded to dunk on the committee of dickheads (led by then-Senator Joe Biden) with one of my all-time favourite quotes:
This is not an opportunity to talk about difficult matters privately or in a closed environment. This is a circus. It's a national disgrace. And from my standpoint, as a black American, as far as I'm concerned it is a high-tech lynching for uppity blacks who in any way deign to think for themselves, to do for themselves, to have different ideas, and it is a message that unless you kowtow to an old order, this is what will happen to you. You will be lynched, destroyed, caricatured by a committee of the U.S. Senate rather than hung from a tree.
The analysis alone is not conclusive evidence of fraud, but we have begun to get a clearer picture of the events that took place during the early hours of November 4th - the day after election day - when Biden took significant leads in multiple states. When you combine the statistical anomalies with what I'm about to show you, it tells us without a doubt that there can be zero confidence in the accuracy of the results in these two states.
The now infamous video of the election counters "pulling suitcases out from under tables" takes centre stage in this state. This footage broke and was summarily "debunked" by leadstories, however, their fact check leaves a lot to be desired.
On the suitcase claim they said "The officials said the ballots seen in the video were in regular ballot containers -- not suitcases". Yes, they are in fact regular ballot containers, and in typical deboonker fashion, they opt for playing linguistic games for the purpose of slapping politically motivated fact checks on Facebook posts. They claim this was not in fact illegal ballot counting as the witnesses left of their own volition; however, two of the witnesses signed affidavits saying they were asked to leave as counting had concluded for the night. Also, it's worth noting that the attribution of this fact check has been switched from the "journalist" who wrote it originally to the big man Alan Duke. Alan, was it worth it? Do you feel good about yourself now that you've given up your credibility to sell fact checks to Facebook?
Regardless of their insistence that this was legal ballot counting, we know for sure they DID count ballots, as we have the footage. According to The Epoch Times, they stayed back and counted from 11pm to 1am. Imagine my shock when I found out that corresponds pretty closely with the 1:34am update that the VIP identified as the third most anomalous update in the set. That update had 136,155 votes for Biden, approximately 11x the current margin of victory.
In the early hours of Nov 4th, 2020 (why is it always the early hours when all these oddities occur?), we learned that Claire Woodall-Vogg, the election official in charge of collecting the USB drives from each tabulator, managed to forget a USB drive and leave it at the voting precinct. These USBs contain all the votes for the election, unencrypted, and in .csv format...
In her statement about the incident Woodall-Vogg said:
On November 4th, around 3:00am, the City of Milwaukee finished counting absentee ballots, and I began to export the results from Tabulator 7. Tabulator 7 was the last to finish processing ballots and was the only remaining flash drive to be burned. As I burned the flash drive, which can take up to 10 minutes, Milwaukee County Election Commission Director Henry asked that I bring a report for each tabulator regarding the number of ballots processed per precinct.
Woodall-Vogg then processed the reports as requested and delivered the USB drives to the Election Commission, where she discovered that she had left the USB drive in Tabulator 7! I'm shocked! She continued, "I immediately called Kimberly Zapata, a member of my senior leadership team, who was still present at Central Count and confirmed it was still in the machine". She just admitted that she left the USB drive unattended in a building where poll workers were still working. It is immediately apparent that the chain of custody has been broken. There is a reason all paper ballots must be sealed after election day in bags with special security tags that break if removed. There is no assurance that the USB was not tampered with, and there can be no such assurance without an independent forensic audit of the USB device to determine if the votes had been manipulated (Woodall-Vogg claims the export timestamp is the same, but excuse me for not believing the classic "we investigated ourselves and found no evidence").
The letter continues, "Ms. Zapata gave the flash drive to a Milwaukee Police Department Officer who delivered the flash drive approximately 10 minutes later". Her statement begins at 3:00am and she says 10 minutes twice. Accounting for travel time, that would place the time she recovered the USB at roughly 3:30am. Assuming she went directly to upload the votes, wouldn't you know, that corresponds almost exactly with the anomalous update identified by the VIP at 3:42am, where Biden took the lead. See the graph below, and you'll be able to spot the update immediately. Hint: it's the part where Biden overcame a 100k Trump lead in a single update.
So, people of Milwaukee, your election officials think you are stupid. They are expecting that you won't be able to draw the connection between this extremely suspicious vote count update, and a KEY election official losing a USB containing votes. This person single-handedly destroyed the integrity of your election. At best, she made an honest mistake and someone took advantage of it to manipulate the votes, and at worst, she was involved and assisted by crafting this story and lying to the public.
To make matters worse, Joe Biden won a record low of 14 counties in Wisconsin while still winning the state. The next closest to him in terms of the lowest number of counties won is Obama in 2012 at 35. Demand a good faith audit now, or the next four years will be a dark cloud over the executive branch.
]]>The build process has always been dodgy for me:
- the watch (i.e. gatsby start) failed after being up for a while
- builds didn't work on the Windows Subsystem for Linux
- it was overburdened with configuration modules
The Lighthouse audit results after my first round of changes
The biggest selling point for me is the getStaticPaths function in the Next.js pages. Before, as a pre-build step, I was generating the entire page tree of React components using a node script. Super heavy-handed, and I'm sure there are better ways to do it in Gatsby. What I'm doing now looks like this:
.
└── pages
    └── blog-posts
        └── [year]
            └── [month]
                └── [title].tsx
The resulting output is visible in the address bar in your browser. Blog post routes look like /blog-posts/2020/08/some-name.
[title].tsx
export default function Post() {}

export async function getStaticPaths() {
  // Fetch every post and map it to a concrete route for Next.js to render at build time
  const blogPosts = await getBlogPosts();
  const paths = blogPosts.map(
    post => `/blog-posts/${post.year}/${post.month}/${post.title}`
  );

  return { paths, fallback: false };
}
In the getStaticPaths function you return a list of new paths and Next.js automatically spits those pages out. At build time, you can then use the path parameters to fetch external data and build your components. What this means, in effect, is that your /pages folder no longer maps 1:1 to the static output. So you can't just build a sitemap off the page directory anymore.
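For completeness, the companion getStaticProps on the same page might look something like the sketch below. This is only an illustration, assuming a hypothetical getBlogPost helper that lives alongside the getBlogPosts call above:

export async function getStaticProps({ params }) {
  // params.year, params.month and params.title come straight from the
  // [year]/[month]/[title].tsx dynamic segments
  const post = await getBlogPost(params.year, params.month, params.title);

  return {
    props: { post }
  };
}

The Post component then receives post as a prop, with everything resolved at build time.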
There's a comprehensive article on the topic by Lee Robinson (https://leerob.io/blog/nextjs-sitemap-robots) but this guide also assumes your source pages are 1:1 with the expected output. I adapted his script to build based off the folder output instead.
yarn add -D glob [chalk] [prettier]
import glob from 'glob';
import fs from 'fs';
import { red } from 'chalk';
import prettier from 'prettier';
import prettierConfig from './.prettierrc.js';
(() => {
  // default next js output is `out`
  // all the pages are guaranteed to be html
  glob('./out/**/*.html', (err, files) => {
    // If the glob fails or there are no files in the output, a build probably hasn't been run
    if (err || !files.length) {
      console.error(red('Could not find output directory'));
      process.exit(1);
    }

    const sitemap = `
      <?xml version="1.0" encoding="UTF-8"?>
      <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
        ${files
          .map(page => {
            const path = page.replace('./out', '').replace('.html', '');
            const route = path === '/index' ? '/' : path;
            return `
              <url>
                <loc>${`https://{Your Domain Here}${route}/`}</loc>
                <changefreq>daily</changefreq>
                <priority>0.7</priority>
              </url>
            `;
          })
          .join('\n')}
      </urlset>
    `;

    // Optional: you can remove this block if you aren't using prettier
    const formatted = prettier.format(sitemap, {
      ...prettierConfig,
      parser: 'html'
    });

    fs.writeFileSync('./out/sitemap.xml', formatted);
  });
})();
package.json
{
"scripts": {
"start": "next start",
"build": "next build && yarn run build:sitemap",
"build:sitemap": "node ./generate-sitemap.js"
},
"devDependencies": {
"chalk": "^4.1.0",
"fs-extra": "^6.0.1",
"glob": "^7.1.3",
"prettier": "^1.18.2"
}
}
That's pretty much it for my implementation. You can see my sitemap here https://lukeboyle.com/sitemap.xml.
]]>I realise that this is one of the most well-explored topics on the privacy-conscious edges of the internet, but seriously... Do not trust Google. Facebook seems to be our current punching bag of choice because of their supposed ability to manipulate political opinion, but in my opinion Google is a much more insidious company with far greater potential for abuse. Google is the largest advertising platform by a significant margin (accounting for 36.3% of advertising in the U.S. with Facebook trailing at 19.3%). At the end of the day, if you delete your Facebook account, what are you really missing out on?
Google (or more specifically Alphabet Inc.) owns the largest search engine (Google.com), the largest video streaming platform (Youtube), and the most-used smartphone operating system (Android). You might ask, "What's wrong with that? Sounds like they're just very successful at what they do". Well, let's break down those three markets (Search, Streaming, Mobile).
Google's estimated market share for search traffic globally is 92.16% source. As people increasingly use search to navigate the web (as opposed to typing a URL into the address bar), this traffic increases, those people see more ads, and Google makes more money. Google then uses this money to purchase exclusivity agreements with the likes of Apple (just two years ago it was announced that Google would be paying $12 billion US to Apple to remain the default search engine on Safari in 2019 source, at a cost of roughly $10 per user).
If you ask the average user how Google search works, they'd probably say "it just searches for your search term across the web", but they probably don't know that this is just the tip of the iceberg. Other dimensions of search include:
There's certainly an argument to be made for suppressing some search results, such as pro-authoritarian sites (e.g. communist or fascist), extremely fringe conspiracy, illegal pornography, or bomb-making instructions. Advertisers probably don't want their ads next to those results. Rightly or wrongly, Google is already suppressing content from such websites (though, they're probably still indexing them).
If Google can suppress fascist content from sites like Stormfront (prominent white-supremacy forum), then who is to say which content they can or cannot suppress? Breitbart is a well-known right wing news site that has had their content almost entirely purged from Google search results (as evidenced by the "search engine visibility" chart below).
You don't have to agree with them politically to see that Google is applying different standards to conservative content than to more liberal content. I don't visit Breitbart, I don't read their articles, and frankly I don't give a shit what they have to say, but I believe in a free and open internet. If you believe in a free and open internet then you have to agree this is wrong. During the cold war, anyone who didn't follow the extreme protectionist beliefs of the time was shouted down as a communist (even Martin Luther King Jr. was dismissed as a communist by J. Edgar Hoover). The same thing is happening now, but the buzzword is different. The new weaponised word is "Nazi". If history had unfolded differently, I have no doubt that it would be left-wing websites being suppressed in search results, and that still wouldn't be okay.
There's plenty of evidence to suggest that Google is manually making these decisions to block conservative websites, however, Alphabet CEO Sundar Pichai denied that they manually censor websites at the recent Congressional antitrust hearing except for in cases where there are legal requirements or copyright issues. I don't buy that, personally.
When YouTube was founded it was facing severe scaling problems (because video processing and streaming is extremely expensive). Fortunately for them, Google saw potential in the platform and purchased the company for $1.65bn in Google stock, and their money issues were over. Google was throwing money into scaling the platform, and it was experiencing great growth. This success turned out to be a major problem for YouTube because, from the time it was purchased, it had been making a loss. In recent years, YouTube has become profitable; however, without the bottomless pockets of Google behind it, it never would have been able to accomplish this. What incentive could Google have to take losses year after year on YouTube? Well, it turns out user data is particularly delicious. Mastercard's CEO has infamously said "data is the new oil". I personally can't wait for Facebook, Amazon, and Google to become para-military organisations in the up-coming data wars.
YouTube has essentially bullied its way into market dominance using Google's bottomless pit of money. This is problematic because it allows failing companies to cheat death, like a bottom-feeding fish latching onto a whale shark and hitching a ride. As I mentioned before, video streaming is extremely expensive, so it makes sense that great cloud infrastructure is a prerequisite to success. Well, big surprise, Google offers world-class commercial cloud infrastructure with Google Cloud Platform (GCP)! Do you suppose YouTube is paying full price for their infra?
So, when you see a headline that says "Stop paying for iCloud – Google One will now back up your iPhone for free", before obeying the shill who wrote it, you should ask yourself, "How can a company afford to give away so much storage space for free?". Well, they can't. Google simply obscures the losses with the immense revenue from Google Ads in the profit/loss statement at the end of each quarter. For more reading about this topic, Tim Bray has a fantastic article called "Break up Google".
This article is already becoming too long, so I'm just going to cover mobile quickly. As Tim Bray mentions in the article above, Android isn't really a business. The only real non-ad revenue they have is from the commission they get from app purchases and licensing fees from OEMs (e.g. Samsung, Huawei, LG). How, then, are they able to sustain hundreds of highly paid engineers and all the other non-technical staff required to support the system?
Above is a map of Android vs iOS market shares. You can see that iOS pretty much only has the dominant market share in first world countries (like the USA, Canada, Australia, the UK, and Japan). Most of the emerging countries in the world are strongly in favour of Android because, unlike Apple, the OS is not restricted to a particular device. So, countries like India (where the number of smart-phone users has increased sharply from 199 million in 2015 to 401 million in 2020 source) mostly purchase low-cost Android phones (e.g. Huawei, Xiaomi, Oppo). Emerging markets are extremely important to companies like Google partly because these countries are easier to exploit: they don't have strong legislation to protect users from predatory advertising, anticompetitive tactics, or data privacy violations. This is why I speculate that Mastercard is scrambling to connect refugees to the global payment network (remember that quote from the Mastercard CEO: "Data is the new oil") and, indeed, why Mastercard forced Patreon to ban Robert Spencer for his anti-refugee sentiment.
Again, regardless of whether you agree with someone's political leaning or rhetoric, I shouldn't have to explain why it's ludicrous for people to believe that faceless, soulless corporations such as MasterCard or Google give two fucks about moral righteousness when the only thing they serve is a number ticking up and down on the Nasdaq website.
So, after reading all of that, I have to ask:
Why don't you route all of your web traffic through Google Servers?
To be clear, I'm not accusing Google of storing DNS logs or associating that with specific users (they claim that they don't in their terms of service), however, I think it's unreasonable to think that they wouldn't be capable of that. I also wouldn't put it past them to lie in terms of service considering their recent run-ins with the law ($1.7bn fine for anti-competitive behaviour, $170m for violating children's privacy on YouTube, 50 million euro fine for GDPR violations).
$2bn doesn't matter to Google. It's a drop in the bucket, especially considering they would probably be able to freely harvest user data for months or even years before they're caught and slapped on the wrist. If a single user's search data is worth upwards of $10 a year (see the Safari Google default search engine deal) for Google, then the complete logs of their browsing history would be quite juicy indeed.
Okay, so that's verging on conspiracy theory I suppose. Maybe Google DNS will remain clean. How about you get a Google® Nest™ WiFi mesh router and let them inspect all of your web traffic that way?
Or perhaps you want to buy the new Pixel and give them advanced analytics about how you use your phone (privacy class action lawsuit), everywhere you go (Location History), how much physical activity you do (Google Fit), every article/video you engage with (Chrome), everything you buy (Google Pay - and incidentally, how much disposable income you have, so they can better target more relevant ads to you). All of these "services" are simply a ruse so that Google can build an extremely accurate profile about the type of consumer you are and target you with more advertising to turn you into a soulless consumer.
I don't want these people to also be the arbiters of what content I should or should not be able to see online.
Well that was pretty depressing. So, how can you reclaim a shred of your privacy?
There's a swathe of privacy-focused alternatives popping up these days. I personally use duckduckgo.com, which is built on the Bing search API and does not track any user data. I'll concede that DuckDuckGo's search results aren't quite as good, but I'm okay with that. Another one is https://www.startpage.com/ which actually uses Google results, but ensures Google can't track your activity.
I'm currently using invidio.us which, like Startpage, is just a wrapper for YouTube. So you can get the same content minus the tracking. Bonus, check out Invidition on the mozilla extension store to open all youtube links in invidio.us instead.
I really don't have an answer for this one. I'm an iPhone user, but really, Apple is not much better, especially if you care about having a repairable device. If you really want to go hardcore, there are some custom Android forks like https://grapheneos.org/
I didn't really touch on Chrome, but I'm not happy with Chrome either. Since Edge has switched to using Chromium the only real competitor (i.e. non-Chromium) in the browser market share is Safari. I use Firefox because I believe in Mozilla and their commitment to maintaining privacy. They're doing good stuff lately.
]]>So how did I "increase my 1RM by 18-35%"? One simple trick! "Adherence" (and maybe a bit of residual newbie gains).
Adherence before OLAD
Adherence after OLAD
I really just owe this adherence to the renewed enjoyment I've had in the gym. It's pretty great to get out of the gym within 45 minutes (excluding prehab/rehab). I was quite fatigued by the few cycles of 5/3/1 I had just done (not to mention the 1.5 hour workouts), so it made sense I was burned out.
All things considered, I'm really happy with this program. After my second knee dislocation in 2019 I didn't expect to be squatting again but here we are. It's not the most encouraging sign when your bench is beating your squat, but I'm not giving up.
I have no recent squat footage so here's some of this weird reverse safety bar front squat thing I got from John Meadows
My progress plateau might suggest that this experiment was a failure; however, it has made the gym far more enjoyable than cranking out the same repetitive workout week in and week out. I also noticed that I end up spending far more time on the core exercise and often won't add any accessories. Workouts are overall shorter and more satisfying. I've also been able to rotate in more variations (e.g. push press, pin press, safety bar squats, deficit deadlifts) which helps with lift boredom. The next logical step from here would be to make my workouts more consistent and strategic.
The One Lift a Day (OLAD) system has gained more popularity in recent years. Eric Bugenhagen has been championing OLAD for years and Alec Enkiri recently broke down his OLAD program. Given his insane strength and general athleticism (585lb deadlift, 4.5 second 40, 60" box jumps) it's always interesting to see how his programs reflect that. I challenge you to find a cookie cutter program that includes resisted sprints. The exercise selection with OLAD is entirely up to you and should be based on your goals, but Alec suggests including a squat, hip hinge, loaded carry, horizontal press, vertical press, upper body pull.
For rep schemes and progression I turned to Dan John's one lift a day program. This program is built on 4-week cycles like so:
I'm going to be doing this program for 3 months (i.e. 3 of these 4 week cycles). My exercises (with a recent set in brackets):
I'll be documenting progress for the next 3 months and we'll see if the gains gods bless me.
]]>One of the unfortunate realities of life is that people die. The music industry, eager to exploit, will often take this opportunity to cash in. They'll bastardise the hard work of productive musicians and release posthumous albums, and countless remixes of classic songs from various artists. Look no further than the discography of Notorious B.I.G, 2pac, Big L, or more recently, XXXTENTACION. The latest project from X's estate was an album released in late 2019. Out of 25 tracks, 17 tracks had featured artists for a total of 21 guest verses.
The announcement of Circles was something different. Mac was working diligently to finish this album before his untimely death in 2018. A significant portion of Swimming and Circles was "executive produced" by Jon Brion who carried the torch and tried to finish the album as Mac intended it. When I initially saw his writing credits on Swimming I was quite surprised.
I pondered the mystery of how you connect the dots between an eccentric composer who is most well known for his work on movie scores and a trendy rap artist. Turns out, it's not that much of a mystery at all. In an episode of What's In My Bag with Amoeba records Jon Brion says, "You know, my complaint about most people who make records and go out and shove their shit down people's throats is that... all I see them doing is giving me their impression of what they think they're supposed to be doing. And it's what bores me about 99.999% of people who make stuff." For a man who said that to clear his calendar and help finish Circles, he must have really been excited about working with Mac.
If you consider his work in film, Jon really does a masterful job of conveying tone. I have to thank Jon for the inspiration/encouragement he gave Mac, but in his interview with Zane Lowe, he insists "That's not something I created, that's something he was doing and I was only asking him to recognise that it was already great."
The album is littered with references to time. More specifically it seems to be about how you perceive time. Do you let time be a tyrant in your life and fight it, or do you go with the flow and ride it out? One of the greatest albums of all time is Dark Side of the Moon which also featured references to time very prominently. Dark Side of the Moon starts and ends the same way, with a heartbeat. This seems to represent the cycle of life: a notion that is paralleled in the opening track Circles. The line goes "I just end up right at the start of the line. Drawin' circles." Similarly, every day ends the same way it begins; the hands of the clock go around in a circle until they strike 12.
Hey, one of these days we'll all get by
Don't be afraid, don't fall
After the very sombre start, the synth and funky bassline (shout out to MonoNeon) of Complicated were fairly jarring on first listen. After going back through for a second time, it really just seemed to make sense. The lyrics are very dour, which contrasts beautifully with the rather upbeat and playful instrumental. Many songs use this tactic like a trojan horse to insert some meaning into poppy songs, because they probably wouldn't top the Billboard 200 if the instrumentals matched the lyrics (see Hey Ya by Outkast). A recent example is the Purple Mountains song All My Happiness Is Gone. The late David Berman said in an interview "it just complexifies the profile of it to have the music and the words at odds". As it turns out, Complicated wasn't originally made for the album[1], but it fits beautifully. Leading on from Come Back to Earth on Swimming where he says "I just need a way out of my head", Complicated has the first reference to his difficulties unravelling the mess in his head, which ties into Good News later in the album.
With Circles and Complicated, I get the impression that Mac was living in the present which is a comforting way to approach life when you're going through a hard time. Lyrics like "I've got all the time in the world, so for now I'm just chilling" and "'fore I start to think about the future. First, can I please get through a day?". He's taking life one day at a time and working through the "clutter" in his head. The downside of this approach is you can easily forget about the bigger picture and end up clinging to destructive coping mechanisms.
Won't give a fuck about tomorrow if I die today
Aside from a few leaks like "Telescope" (which became Woods) and Once a Day (which was played during his A Celebration of Life as a straight piano ballad), Good News was the only music released to promote the album (perfect choice). Mac opens the track talking about fighting his demons ("I spent the whole day in my head, Do a little spring cleanin'"), but often feeling hamstrung by his own instincts to self-sabotage. He says, "I wish that I could just get out my goddamn way", and "Why I gotta build something beautiful just to go and set it on fire?"
Delivered with a different interpretation, these lyrics could have taken on a very moody inflection and a much darker tone, but the muted string plucking and sparse instrumentation give the track a very calming, ethereal feeling. This track also features guitar from Wendy Melvoin (guitarist for Prince's band, The Revolution). Considering that John Mayer played guitar on Small Worlds, it's really clear how infectious Mac Miller's talent was. These great musicians happily collaborated with him with very little public recognition. Paired with the subdued vocals, it is just a beautiful tribute to his life and legacy and a reassuring reminder from the great beyond.
Especially when coupled with the music video, it has a surreal quality that feels like he's in the room talking to you. It almost sounds like he's reassuring people from the great beyond with lyrics like "There's a whole lot more for me waitin' on the other side". It's a story of the immortal quality of music. So, it only makes sense that an image of Mac appears in a Lotus flower which has a symbolic meaning in Buddhism. It resembles the purifying of the spirit which is born into murkiness[2]. The ending of him walking through the airplane window and disappearing as a ripple in the water is superb and continues to drive home the idea of swimming. Good News is the best send-off anyone could hope for.
I was drowning but now I'm swimming
Woods is my favourite track on the album. It's probably the most subdued track with very sparse lyrics and instrumentation. The opening lines of "Things like this ain't built to last, I might just fade like those before me" could be interpreted a few ways. This album was recorded shortly after the loss of a relationship. It feels like saying "If you were able to forget the people you used to love and love me, then surely you'll be able to forget me and move on". Another interpretation could be his anxiety about his legacy and whether he did enough to be remembered.
Hand Me Downs is a testament to Mac's commitment on his last two albums. In my opinion the best albums seldom have (many) artists featured. To me, it exudes a lack of confidence in what you're producing and a level of insecurity that you feel as though you need to attach bigger names to your songs for them to be well liked. The feature on this album by Baro Sura was a very purposeful choice and it only seems to serve the album's narrative and vibe. And with Baro being a relatively unknown artist it feels like this was someone Mac really liked and believed in, and he delivered a great performance.
I've written before about how I like albums the most when there's a consistent vision from the lyrics to the production. Out of all of his studio albums, Mac seemed to take care of around 30-50% of the production and outsource the rest of it to other producers[3]. Circles, however, sees a staggering 75% of production handled by Mac. For whatever reason, the production is credited to Mac Miller, not his oft-used production moniker "Larry Fisherman". I like to believe that's because this send-off is a time capsule for us to see who Malcolm really was around the time of his death. A truly honest expression of Mac the musician.
Hands takes the metaphor of time further. To me, the beat sounds reminiscent of a clock ticking, and hands seems to refer to the hands of a clock. It seems to be a letter to himself urging him to ease up on himself and stop feeling so low. "When's the last time you took a little time for yourself?" and later in the chorus "No, I stay behind the wheel and never half-speed". "Never half-speed" might suggest that he is always speeding and might benefit from slowing down. This hearkens back to the line on Small Worlds (Swimming): "I'm always in a rush, I been thinking too much".
Throughout the whole album it feels as though there's two versions of Mac. There's the self-reprimanding Mac with unrelenting standards (the Yin, if you will) and the Mac that is more forgiving and reinforcing that his imperfections aren't the sum of his existence (the Yang). If he could just achieve a balance between these two sides then maybe life would be just a little bit less complicated. This concept of Mac's duality could also explain why the album art features two images of himself superimposed.
Yeah, why don't you wake up from your bad dreams?
When's the last time you took a little time for yourself?
Strangely, it feels as though any track on the album could have been the last track and it wouldn't be lacking. But what's important is that the final track represents the last goodbye. The important thing to nail is the tone. Jon Brion made the decision to put Once A Day at the end, and it was a very intentional decision. What message did he want to leave us with for the final act of Mac? Once a day opens with a monologue "Once a day, I rise. Once a day, I fall asleep with you". He's really talking to himself. You spend more time with yourself than anyone else and it's important for you to be comfortable with that. "Don't keep it all in your head. The only place that you know nobody can ever see". Once A Day is a tale of inner peace and a final reminder that we need to stay open, not just to others, but most importantly, to ourselves.
The painful part of listening to this album is the feeling of finality. There will never again be a Mac Miller album that was wholly - or even partially - designed by him. The circle is often seen as a symbol of permanence and immortality, like the Ouroboros (a snake eating its own tail) in Greek mythology. So, it's only fitting that the album to solidify his legacy in our minds is Circles.
Circles feels like a realisation that life isn't a comedy but it's more of a tragedy with comic relief. True maturity is welcoming that reality but recognising that sometimes the best you can do is keep Swimming in Circles.
It has been a couple of years since Gitlab's rise to prominence and the market has certainly shifted. Even before Github was acquired by Microsoft in mid-2018 (source), they were hard at work pushing out feature after feature.
Off the top of my head, I can recall these:
Github Actions is now in open beta (you can opt in here: https://github.com/features/actions) and it enables you to set up containerised builds, testing, and deployments in response to many Github events (push, pull requests, tags, schedule).
The process is much the same as something like CircleCI, Travis, or Buildkite. The integration for CI checks on pull requests and commits has been in Github for years, allowing early warning for pull requests that break the build.
In this post I'll be showing you how to set up a workflow to build and release a single-page React app.
Keep in mind that the v1 Github Actions syntax has been deprecated, so make sure you are looking at the yaml documentation. There's a handy warning at the top of the deprecated pages:
The documentation at https://developer.github.com/actions and support for the HCL syntax in GitHub Actions will be deprecated on September 30, 2019. Documentation for the new limited public beta using the YAML syntax is available on https://help.github.com.
Find the docs here: https://help.github.com/en/categories/automating-your-workflow-with-github-actions
For this example, I'll be using Create React App. Initialise that if you'd like to follow along, or just retrofit an old, simple project.
There are two flows I want to create: a CI build that runs on pull requests and pushes, and a deployment that only runs when code lands on master.
Let's create the action file.
Create a folder in the root of your repo .github/workflows
Create a file in that folder called ci.yml
Let's look at the ci.yml file and add some boilerplate
ci.yml
name: CI
on: [pull_request, push]
jobs:
  build:
    runs-on: ubuntu-18.04
    steps:
      - uses: actions/checkout@master
      - name: Use Node.js 10.x
        uses: actions/setup-node@v1
        with:
          version: 10.x
      - name: Build
        run: |
          npm install
          npm run build --if-present
The first thing to note is the on option (docs for on). This field is a list of the events you want to respond to. For this workflow, I'm responding to both pull requests and pushes. Because this on property is at the top level, regrettably you can't combine all your steps and choose to skip some of them on pull requests. This is the reason for having two separate action files. In principle, the actions should be entirely self-contained processes.
The jobs field is a list of independent jobs. By default, they run in parallel. You could use this to separate things like your unit and integration tests to speed up your CI. This example is pretty simple, so I haven't found a use for multiple jobs yet.
The steps field is quite simple in this example. For each step, you can choose to specify the uses field (docs). The format for this argument is [owner]/[repo]@[ref] or [owner]/[repo]/[path]@[ref]. You can reference actions in your current repository or you can reference standard actions as per the example above. actions/checkout@master checks out the current branch. actions/setup-node@v1 sets up Node, probably through a Docker container. You can provide arguments to the action using the with key.
Now, the magic begins. Go to your repository and visit https://github.com/[yourName]/[yourRepo]/actions. You'll be prompted to enable Actions for this repository. Hit enable, then commit your ci.yml file, push it up and check the Actions tab. You should begin to see your commits start popping up under the relevant action.
In the image below, you can see the left side has the name of the action, the event that triggers it, and the jobs below that.
With luck, we now have our CI build successfully running. Onto the deployment action. Copy the below to your ci.yml
ci.yml
name: CI
on:
  pull_request:
  push:
    branches:
      - master
jobs:
  build:
    runs-on: ubuntu-18.04
    steps:
      - uses: actions/checkout@master
      - name: Use Node.js 10.x
        uses: actions/setup-node@v1
        with:
          version: 10.x
      - name: Build
        run: |
          npm install
          npm run build --if-present
      - name: Deploy
        if: github.event_name == 'push' && github.ref == 'refs/heads/master'
        env:
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET }}
        run: node scripts/deploy.js
You'll note that at the moment we're executing this workflow on both pull request and push events. This means that unless we add a filter, we'd be deploying branches on any pull request, which could probably break our app.
To the Deploy step, we've added an if. This if should be a boolean expression that determines whether to run the step or not. You could do things like check if a previous step was successful, or, in our case, check that the event is a push to the master branch.
Moving onto deployment: if you look at the env key, this is how we provide environment variables to the step. These are accessible in node scripts via process.env. In this example the values aren't hardcoded; they come from the secrets manager Github provides within your repository. Don't worry about that node script yet.
At a previous job, they outlawed all external CI services because they were worried about their AWS IAM keys getting out in the event of a CircleCI data breach. Given that we're dealing with Github + MSoft, I have to believe there's some encryption magic happening when you upload and access these secrets. Once you've set the value in the secrets, you will not be able to see it again and it will only be exposed to the CI agent.
I tried to log one of these secrets, and cleverly, it was censored in the logs (see below). Gone are the days of having to rotate your IAM keys because you accidentally logged it in your CI or Cloudwatch.
I'll come back to those AWS secrets shortly. From this point, all we have to do is deploy. I'm going to offer three suggestions:
I would argue that S3 is superior to Github Pages. The unfortunate part of Pages is that it can only serve from files in the repository, so you have to commit your built files in order to host. However, Pages are free forever, unlike S3 sites which will begin to cost if you start having significant traffic. If performance is a concern for you, look elsewhere as neither of these are going to be blazing fast.
I'd suggest going with Github pages for simplicity as you'll avoid setting up an additional account (and potentially save $$).
Most sites I make are not under high demand, nor do they have many concurrent users, so for my purposes, S3 storage is more than enough.
I also use Cloudflare to cache the assets, so the majority of sessions download assets off the Cloudflare CDN, rather than S3, so my usage stays very low for S3. This also has the benefit of using Cloudflare's smart routing to make my Sydney hosted S3 bucket much faster for international users.
See the example repository here: https://github.com/3stacks/github-actions-react-s3
First I'll quickly go through how to get your S3 bucket and IAM keys and be a bit responsible in the process.
- Click Create Bucket and give it a URL-friendly name, the same as the domain you will use for it
- Untick the Block all public access checkbox so the bucket can be read publicly
checkbox{
"Version": "2012-10-17",
"Statement": [
{
"Sid": "PublicReadGetObject",
"Effect": "Allow",
"Principal": "*",
"Action": "s3:GetObject",
"Resource": "arn:aws:s3:::your-arn-here/*"
}
]
}
With this policy, any user that queries the bucket can get any object in it, so please, don't store anything private in there. Enable static website hosting on the bucket and set the index document to index.html.
We're going to start by making a policy that is our deployment policy for this bucket. It ensures that if the keys to an IAM user leak all you'll be giving away is access to that single bucket.
{
"Version": "2012-10-17",
"Statement": [
{
"Sid": "VisualEditor0",
"Effect": "Allow",
"Action": "s3:ListBucket",
"Resource": "arn:aws:s3:::your-arn-here.io"
},
{
"Sid": "VisualEditor1",
"Effect": "Allow",
"Action": ["s3:PutObject", "s3:GetObject", "s3:DeleteObject"],
"Resource": "arn:aws:s3:::your-arn-here.io/*"
}
]
}
- Create an IAM user with Programmatic Access
- When setting permissions, choose Attach existing policies directly and attach the deployment policy you just created
- Add a repository secret called AWS_ACCESS_KEY_ID and copy the corresponding value from your newly created IAM user
- Do the same for AWS_SECRET with the IAM user's secret access key
Now your Github Action will pick these up in ci.yml. Copy the contents of the deployment script from here: https://github.com/3stacks/github-actions-react-s3/blob/master/scripts/deploy.js to a directory (./scripts/ is what was defined in ci.yml, but you can change this if you prefer a different directory). Make sure you update the S3 bucket name on line 24.
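If you'd rather write your own instead of copying the linked script, here's a rough sketch of what a deploy script along these lines could look like. This is not the script from the repo above, just an illustration using the aws-sdk and glob packages (both would need to be installed), with a hypothetical bucket name and a Create React App build folder assumed:

import AWS from 'aws-sdk';
import glob from 'glob';
import fs from 'fs';
import path from 'path';

// Hypothetical bucket name, swap in your own
const BUCKET = 'your-bucket-name-here';

// The access keys are picked up automatically from the env vars set in ci.yml
const s3 = new AWS.S3({ region: process.env.AWS_DEFAULT_REGION });

// Map a few common extensions so the browser receives sensible Content-Type headers
const contentTypes = {
  '.html': 'text/html',
  '.js': 'application/javascript',
  '.css': 'text/css',
  '.json': 'application/json',
  '.svg': 'image/svg+xml'
};

glob('./build/**/*.*', async (err, files) => {
  if (err) {
    throw err;
  }

  for (const file of files) {
    const key = file.replace('./build/', '');

    // Upload each built file to the bucket under the same relative path
    await s3
      .putObject({
        Bucket: BUCKET,
        Key: key,
        Body: fs.readFileSync(file),
        ContentType: contentTypes[path.extname(file)] || 'application/octet-stream'
      })
      .promise();

    console.log(`Uploaded ${key}`);
  }
});

Note that the secrets never appear in the script itself; the SDK reads AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY straight out of process.env.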
Your ci.yml workflow should resemble the below:
ci.yml
name: CI
on:
  pull_request:
  push:
    branches:
      - master
jobs:
  build:
    runs-on: ubuntu-18.04
    steps:
      - uses: actions/checkout@master
      - name: Use Node.js 10.x
        uses: actions/setup-node@v1
        with:
          version: 10.x
      - name: Build
        run: |
          npm install
          npm run build --if-present
      - name: Deploy
        if: github.event_name == 'push' && github.ref == 'refs/heads/master'
        env:
          AWS_DEFAULT_REGION: ap-southeast-2
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET }}
        run: node ./scripts/deploy.js
Ensure you set the region you would prefer in the deploy env.
Now we're done! Commit those changes, push it and you'll see the build run and deploy your app.
Visit http://[bucketName].s3-website-ap-southeast-2.amazonaws.com/
to verify.
From now on, commit on master and your code will be deployed automatically.
See the example repository here: https://github.com/3stacks/github-actions-react-pages
Visit https://github.com/[yourName]/[yourRepo]/settings
and scroll to the Github Pages section.
Here you may enable Github Pages on the master branch (root folder, i.e. you build into the root directory), the master branch /docs folder, or a separate gh-pages branch. I prefer to use a separate branch, as it's generally advisable to keep your master branch clean of build files. To enable the gh-pages branch, the repo must already have one. In your terminal, do the following:
git checkout -B gh-pages
git push origin gh-pages
Back in your browser, select the gh-pages branch in the Pages dropdown (see below):
From here, deployment is fairly painless. Let's take advantage of the Actions ecosystem Github is building and use: https://github.com/marketplace/actions/deploy-to-github-pages?version=1.1.2, an action written by James Ives.
First we have to generate a personal access token.
- Click Generate new token
- Select the appropriate scopes. We only need the repo-related scopes (below)
- Do not share this key with anyone. It has read/write access to all your repositories
- Add the secret as per the Storing and using the secrets section above, calling your access token secret GITHUB_ACCESS_TOKEN
Back in ci.yml, update your workflow to the following:
name: CI
on:
  pull_request:
  push:
    branches:
      - master
jobs:
  build:
    runs-on: ubuntu-18.04
    steps:
      - uses: actions/checkout@master
      - name: Use Node.js 10.x
        uses: actions/setup-node@v1
        with:
          version: 10.x
      - name: Build
        run: |
          npm install
          npm run build --if-present
      - name: Deploy to GitHub Pages
        uses: JamesIves/github-pages-deploy-action@1.1.3
        if: github.event_name == 'push' && github.ref == 'refs/heads/master'
        env:
          ACCESS_TOKEN: ${{ secrets.GITHUB_ACCESS_TOKEN }}
          BRANCH: gh-pages
          FOLDER: build
Our secret and other required arguments will be provided to the Pages Deploy action using the env key.
Due to the way routing is done in Github Pages, assets referencing / will go to the root of your Pages domain (e.g. https://3stacks.github.io). This means none of the assets in CRA will be loaded. To get around this, in your package.json, add "homepage": ".". This will make the assets resolve correctly.
Now we're done! Commit those changes, push it and you'll see the build run and deploy your app.
Visit http://[yourName].github.io/[repo-name]
to verify.
From now on, commit on master and your code will be deployed automatically.
COMING SOON - This section is not complete
Github Actions also supports using specific Docker containers from Dockerhub. So if you have complicated dependencies, you can choose to utilise this option. Use the uses key and give it a path in the format of docker://[image]:[tag] (see https://help.github.com/en/articles/configuring-a-workflow#referencing-a-container-on-docker-hub).
]]>YNAB is a very interesting take on budgeting. I used to swear by my old way of using a spreadsheet, but it sort of falls apart when your pay is irregular (like for self-employed people or freelancers with variable income). You can connect your bank accounts for automatic transaction feeds but I prefer doing it manually as it seems to make you more mindful about your spending.
I'm a real metric head, so I appreciate some good graphs.
Age of money tells you, on average, how long you hold onto money between getting paid and spending it. It's very encouraging to see yourself breaking the cycle of living pay-cheque to pay-cheque.
Net worth is pretty self explanatory. It tracks your assets versus your debts and gives you a nice net worth graph over time.
You also get a categorical breakdown of your spending, which you can click into to see more specific information about each category.
The bulk of their blog posts are not YNAB specific, but include general advice for budgeting, so if you're struggling, it may be helpful for you.
If that sounds appealing, there's a link below which includes a referral (if you sign up I get a free month. If you aren't cool with that, just search for YNAB). They offer a month long free trial if you feel like giving it a shot.
According to their website, Cloudflare now powers nearly 10 percent of all Internet requests. I've been using them for a few years now and I'm still in awe of them. First of all, when I started using them I was still paying for SSL certificates; then along comes this start-up that offers DDoS protection, SSL and caching for free... Where's the catch? I do find it somewhat suspicious that they're able to offer these services for free. Presumably the money they make off enterprise accounts offsets the usage at the free tier.
The DNS settings are really easy to use too. I use the analytics on this site and it seems to block a few threats a week.
They are now offering domain registrations which I haven't taken advantage of, but they seem to be cheaper than your run of the mill registrar.
They also do a lot of very interesting technical writing. Curious why their office has a wall covered in lava lamps?
Technical version - https://blog.cloudflare.com/lavarand-in-production-the-nitty-gritty-technical-details/ Non-technical version - https://blog.cloudflare.com/randomness-101-lavarand-in-production/ Article about it - https://www.fastcompany.com/90137157/the-hardest-working-office-design-in-america-encrypts-your-data-with-lava-lamps
Check them out: https://www.cloudflare.com/
Password managers are certainly rising in popularity and it's a good thing. The password is a very flawed authentication method especially when you re-use the same weak password across multiple sites. If you can instead remember one very strong password, you'll be able to generate strong, unique passwords for every service you use. They also now have built in support for the Google Authenticator protocol with TOTP tokens.
I really like the mobile and desktop apps and they have recently released a browser only client. Along with their provided cloud sync options, they also offer personal cloud storage syncing.
You can also enable travel mode for when you're overseas which stops syncing sensitive vaults.
I use the shared vaults a lot to share with coworkers.
I'm planning on keeping this list updated should anything change, so keep your eyes on this post.
]]>"Agile" in its current sense appears to be derived from the Agile manifesto, however, agile practices have roots through the last 4 decades of programming history. Recently I read the Mythical Man Month (Brooks, 1975) and in it Brooks extolls the virtues of things like disposable prototypes, testing as you build, and always having a working program.
One of the most recognisable and user-friendly explanations of this concept is "The Agile Bicycle" illustrated by Henrik Knilberg
This is a great example of delivering a minimum viable product (MVP). There are many benefits to this method:
Regardless of how rough around the edges your product is, if it is functional, then people can use it. It may not have the appeal to gain significant traction, but you can start getting at least some ROI, and - perhaps more importantly – user feedback. If a product is fundamentally flawed, it should be visible at any stage. According to Brooks, an incremental build method is better because:
The most important part of that is that while we may not deliver the full feature set at the initial release date, at the very least, we’re not going to be giving people a car without a steering wheel.
So, how does a company selling pre-packaged meals relate to software MVPs?
I’ve been using them for around a year. I picked them because, unlike similar competitors, they offered meals with higher calorie counts at a similar price point. My first delivery came in an unmarked Styrofoam box. Styrofoam is good at insulating contents; however, since it requires specialised machinery to recycle and takes untold millions of years to degrade, it’s not a great material. The meals came in take-away style containers with a sticker slapped on, which were easily broken in transit, and they all arrived frozen. On the technical side, subscriptions were not manageable by the user and had to go through customer service, which added some friction. It wasn’t a mind-blowing experience, but the meals all tasted good and, most importantly, the business model worked.
Over the last 12 months I’ve observed various improvements to their offering.
While people starting to use them now will see the last year of enhancements as the norm, people who have been using them for longer will have seen the service gradually improve, thus increasing satisfaction. Rather than overreaching and increasing the risk of being crushed by their overhead, My Muscle Chef took an iterative approach and gradually built a loyal base of customers, which enables further innovation.
In my eyes, iterative development is inarguably superior to traditional waterfall project management where oftentimes budget, schedule and feature set are inflexible. As the saying goes, "you don’t know what you don’t know", and as such, progressive discovery will often prove many of your initial assumptions incorrect. It’s very refreshing to see companies with more tangible products embracing Agile principles and prospering. As they say, the proof is in the pudding.
To be clear, I am in no way affiliated with this company, I just like eating their food. If you do end up signing up, consider using my referral code (S1HKD51IM) and we’ll both get $15 credit. Love those free meals.
]]>I was drowning but now I'm swimming
In an interview with Zane Lowe, Malcolm said "Do you ever feel invincible? I lived a certain life for 10 years and faced almost no real consequences. I had no version of the story that didn't end up with me being fine". He had recently been arrested for crashing his car while under the influence of alcohol. He took this as a wake-up call, but it appears it was too little too late. Perhaps if he had faced consequences sooner, he wouldn't have been allowed to fall so far into the hole.
Fame is a double-edged sword and for Malcolm being 20 years old with a Billboard topping debut album, he was thrown into the spotlight and things really didn't stop for him since then. Imagining myself at 20 becoming wealthy and famous, I doubt there's any chance I would exercise any level of restraint. At that time, being known as a producer of "frat rap", it almost seems like a self-fulfilling prophecy that that would lead to out of control partying and substance abuse. The entertainment industry has a habit of dragging people in and beating the shit out of them. When you consider the story of Avicii and the adversity he faced essentially being forced to tour and perform even when he was begging his manager to cancel the shows, it's easy to see why so many people don't make it. Other artists such as Deadmau5 and Earl Sweatshirt were able to see the warning signs and they took breaks to take care of themselves. This year Earl Sweatshirt cancelled a tour to Australia following the death of his father. With any luck he will emerge having dealt with his grief healthily and be better for it.
Not everyone is so lucky, though. Look no further than the 27 club; a sprawling list of people who likely garnered significant fame in their late teens or early 20's but didn't manage to see the decade out. It is a systemic issue that doesn't have a clear root cause. Is it the idolisation of relatively young adults? Or is it a result of an abusive industry that chews people up and spits them out?
It was as early as 2012 that Malcolm spoke about how he didn't want to die of an overdose, but how sobriety was just boring. In the few years after his career took off, it was clear to his fans that he was not doing well. 2015 marked the release of GO:OD AM, and he took the opportunity to check in and let everyone know he was doing alright. The dark period was seemingly over. Contrary to what most people will have you believe, there is a voluntary element to depression. There's a very fine line between living with depression and living in hell. If you allow your self-loathing tendencies to consume you, you will be in hell. Alcohol is a well known depressant, but it's hardly the only thing people commonly abuse. I don't imagine anyone comes out of the other side of an opiate high and thinks "wow, that was an awesome time, I feel great about doing that". Giving in to substances is just one way we sabotage ourselves.
Life is full of peaks and troughs and it's a true tragedy that Malcolm (and so many other people for that matter) didn't make it to their next peak. For me, this is a lesson about the fragility and the tragedy of life. The pain that we feel from his passing will eventually be eclipsed by the gifts of his music and positive energy. The world has changed as a result of what Malcolm accomplished during his short life. I think it's well worth waking up in the morning and being a part of it.
I don't think I'll ever understand why his death affected me so much. It's quite bizarre how connected we as fans were able to feel with him, never having met him, but I think that's just a testament to the artist and person he became. From a frat rapper to a soulful musician expressing himself honestly and uninhibited, he touched fans and artists alike. We are truly fortunate to have had the opportunity to listen to his magnum opus Swimming.
I encourage you to watch the Mac Miller: A celebration of life concert and to donate to the Mac Miller circles fund
Rest in peace Mac Miller.
]]>At the time, I didn't want to sink a lot of time into it, so WordPress was identified as the path of least resistance. I used Bedrock by Roots to version control my plugins and WordPress with Composer. It was working well and was quite fast (for a WordPress website), but it still suffered from a fairly fundamental issue of not being able to version control content. WP apologists might tell you to store your database dumps in your repo, but to them I say: "yeah, nah". If you ever have the misfortune of looking at a WP database dump, you'll realise there's about a billion lines of muck which is totally irrelevant to the content and composition of your website, and I don't particularly like the idea of storing my users table in a public git repository anyway. In spite of my whinging, the version controlled content pain point was more of an under-the-tongue ulcer type of pain than a broken arm, so I didn't worry about it.
One day I made the mistake of upgrading the WP version on my server, and I hadn't copied the install to my local, so there was a lot of out of sync content. So you can imagine I was pretty happy when I found out my login no longer worked, I couldn't reset my password, and changing the password directly in the database didn't work. I took an SQL dump of the database and loaded it into my local, only to find the Advanced Custom Fields don't appear to be stored in the database, so when I salvaged the content it was totally broken.
Then it hit me. What if I get a JSON dump of my posts from the database and turn that into a static version? So, what output format would be most suitable for an archive of text posts?
Markdown was invented by notable 'f-word' writer John Gruber in 2004 and it has since become a staple in the development world. I chose to use Markdown as the output because it provides simple shorthands to represent markup so I knew I could get tidy archiving in Github that would be nicely rendered as html in the web view, but the posts would still be readable (and writable for future posts) when looking at the source. I created a node package for generating an archive and published it to npm in the hopes that it might address the problem for other people too.
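To give a feel for the idea (this is a rough sketch, not the published package), turning a JSON dump of posts into a Markdown archive can boil down to something like the following, assuming a posts.json export and using the turndown package to convert the stored HTML:

import fs from 'fs';
import TurndownService from 'turndown';

const turndown = new TurndownService();
// Hypothetical shape: [{ title, date, isDraft, content }, ...] where content is the HTML WordPress stored
const posts = JSON.parse(fs.readFileSync('./posts.json', 'utf8'));

fs.mkdirSync('./archive', { recursive: true });

posts.forEach(post => {
  const slug = post.title.toLowerCase().replace(/[^a-z0-9]+/g, '-');
  // A small metadata table at the top of the file, followed by the converted body
  const markdown = [
    '| meta | value |',
    '| --- | --- |',
    `| title | ${post.title} |`,
    `| draft | ${post.isDraft} |`,
    '',
    turndown.turndown(post.content)
  ].join('\n');

  fs.writeFileSync(`./archive/${post.date}-${slug}.md`, markdown);
});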
Now I have my posts nicely sorted and stored in a repo, but the problem with generating an archive of Markdown files is then you just have an archive of Markdown files to deal with.
The website is built with the static site generator "Gatsby", so all pages are React components which really adds a lot of flexibility. For example, when generating blog post components I can make the title render as a link to the blog post slug but only when it appears on the front page.
The ingestion strategy is to add the blog-posts repository as a submodule so I can then update and push those independently. Then, at compile time, I would read the archive of blog posts and generate:
The script that is responsible for this is really something to behold (you can see that here). The process is such that all markdown files are grabbed from the archive, then for each post, the script will parse out a metadata table in the top of the file that has the post title and whether or not it is a draft. That post is then passed to the markdown renderer and we generate a blog post component with that rendered content. That blog post component is then given its own page component and it’s stitched onto the aggregate blog post list. The blog post list is then parsed out into pages which are output as components and voilà. I suppose if there’s a gap for it, I could publish a "WordPress Markdown archive to React static site" package, but it may be a bit too niche.
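To make that a little more concrete, here is a stripped-down sketch of just the gather-and-parse step. It is my own illustration under an assumed file layout and metadata format, not the actual script linked above:

import glob from 'glob';
import fs from 'fs';

// Pull the title and draft flag out of the metadata table at the top of a post (assumed format)
function parseMetadata(source) {
  const title = (source.match(/\|\s*title\s*\|\s*(.+?)\s*\|/) || [])[1];
  const isDraft = /\|\s*draft\s*\|\s*true\s*\|/.test(source);
  return { title, isDraft };
}

glob('./blog-posts/**/*.md', (err, files) => {
  if (err) {
    throw err;
  }

  const posts = files
    .map(file => {
      const source = fs.readFileSync(file, 'utf8');
      return { ...parseMetadata(source), source };
    })
    .filter(post => !post.isDraft);

  // From here, each post would be handed to the markdown renderer and written out
  // as a page component, which is what the real script linked above takes care of
  console.log(`Found ${posts.length} published posts`);
});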
The end result is an overall slimmer repository since all of the blog posts are stored in a different repository and the generated pages are not committed which lends itself perfectly to an automated deployment service. It also allowed for much less human intervention in the creative process.
The main caveat I’ve discovered in this transition is that I didn't have a solution for porting assets (such as embedded images) to the markdown archive. Currently, any embedded images will 404 until they are added manually. This definitely isn't ideal and if I ever get a chance I plan to package all the linked assets down into each blog post.
]]>But what if we want the term and definition to sit inline? This usage is semantically a dl, but traditionally, this has been a serious pain in the ass if you want consistent spacing between the terms/definitions. The image below exhibits a compromise I made with the designer on a previous project. Making the dt/dd inline-block works to a certain degree; however, when setting widths explicitly you will have serious issues going down the breakpoints. The display: block span just forces the content to stay in its respective line. This, however, is not correct usage, as a dl is only supposed to have dt or dd elements inside it. EDIT: Since working on this project, it looks like we're now permitted to wrap a dt+dd group in a div to control flow. So how can flexbox help us here?
_button.scss
// Define base component styles (e.g. sizing/positioning)
.button {
border: 1px solid;
padding: 6px 5px;
}
// Dark Color scheme styles
.scheme-dark {
.button {
background: white;
border-color: white;
color: black;
}
}
// Light Color scheme styles
.scheme-light {
.button {
background: black;
border-color: black;
color: white;
}
}
Although this is quite lightweight, there are still issues.
Enter the CSS Variable (the hero we need). CSS Variables are defined like so:
:root {
  /* Initialise the variable */
  --primary-color: pink;
}
p {
  color: var(--primary-color); /* it's pink, baby. */
}
The var function also takes a second argument, which is an initial/fallback value.
p {
color: var(--primary-color, red);
}
CSS Variables follow block scoping principles, so, variables defined in :root
are considered to be global variables (but may be overwritten inside specific components) and variables defined in any other element are scoped to that block of styles. This is broken down very nicely on a recent Smashing Magazine article.
I recently wrote a library to ingest variable names and values and spit them onto the root element (see the package). The idea is that each theme would have all relevant variables defined in objects like so:
const viewState = {
currentTheme: 'darkScheme'
}
const themes = {
darkScheme: {
'primary-color': {
hex: '#FFF'
}
},
lightScheme: {
'primary-color': {
hex: '#000'
}
}
}
And then when the currentTheme changes:
import syncVars from '@lukeboyle/sync-vars';
function updateCssVariablesWithCurrentScheme(colorScheme) {
syncVars(themes[colorScheme]);
}
// if we call that function with 'darkScheme'
updateCssVariablesWithCurrentScheme('darkScheme');
The root element then ends up looking like this:
<html style="--primary-color: #FFF;"></html>
So, how does this help? For one thing, with this approach I no longer have to worry about adding the colour scheme classes to the body, and I don't have to do any hacky overrides. _button.scss now looks like this:
.button {
border: 1px solid var(--text-color-var);
padding: 6px 5px;
background: var(--button-background-color-var);
color: var(--text-color-var);
}
Looking forward, this approach also means that custom colour themes are very nearly within reach. It also means that colour schemes can be changed on the fly: the user could have a colour swatch tool and be previewing their theme changes live. Taking it even further, it means that the colour schemes no longer need to be part of the codebase. They could just as easily be a JSON file on the server, and changes could be pushed flexibly. Why is this exciting? Say it's Christmas time and you want to get into the spirit of things... With a few string replacements you have a temporary festive theme to force upon your users.
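As a rough sketch of what that could look like (the /themes/christmas.json endpoint is made up, and the JSON is assumed to match the theme object shape above):
import syncVars from '@lukeboyle/sync-vars';

// Sketch only: fetch a theme from the server and hand it straight to sync-vars.
function applyRemoteTheme(themeName) {
    return fetch(`/themes/${themeName}.json`)
        .then(response => response.json())
        .then(theme => syncVars(theme));
}

// flip the whole site to a festive palette without touching the codebase
applyRemoteTheme('christmas');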
Sites or apps could have buttons to activate color blind mode and specific 'problem' colours could be swapped out for friendly colours. Additionally, high contrast modes would be a breeze.
Users could activate alternate modes for websites to get a different experience.
CSS variables are getting me really excited because they're the first minimal-overhead approach to theming in front-end-only applications. This is something that will reward well-structured stylesheets and result in a better experience for the user. I am looking forward to rolling out custom themes in Agander and finally getting around to making the flat UI theme I have wanted to make for some time.
]]>
<style>
.button-padding-approach {
font-size: inherit;
-webkit-appearance: none;
border-radius: 0;
border-style: solid;
border-width: 0;
cursor: pointer;
font-weight: normal;
line-height: normal;
margin: 0;
position: relative;
text-align: center;
text-decoration: none;
display: inline-block;
padding: 1rem 2rem 1.0625rem 2rem;
font-size: 16px;
background-color: #999;
color: #000;
max-width: 170px;
}
</style>
<div><a class="button-padding-approach" href="#">A Button</a> <a class="button-padding-approach" href="#">A Button that breaks to two lines</a></div>
This approach works okay, and it's good for multi-line text (buttons where the marketing team sanctioned too much copy). The problem with typography is that glyphs can have descenders (as in y and j) which push the bottom of the bounds down. So if you want to properly vertically center your text, you have to baby the padding so much that it becomes too much of a pain in the ass. The padding on the above buttons is padding: 1rem 2rem 1.0625rem 2rem;. Five significant figures for the bottom padding? I don't think so.
<style>
.button-lineheight-approach {
-webkit-appearance: none;
border-radius: 0;
border-style: solid;
border-width: 0;
cursor: pointer;
font-weight: normal;
line-height: normal;
margin: 0;
position: relative;
text-align: center;
text-decoration: none;
display: inline-block;
font-size: 16px;
background-color: #999;
color: #000;
max-width: 170px;
height: 50px;
line-height: 50px;
padding: 0 2rem 0;
}
</style>
<div><a class="button-lineheight-approach" href="#">A Button</a> <a class="button-lineheight-approach" href="#">A Button that breaks to two lines</a></div>
This approach is a lot less hands-on for the vertical alignment. You set height: 50px; and line-height: 50px; and voilà, perfect vertical alignment. Until you need two lines, and then it bleeds out of the button because you thought a CTA would never be more than 3 words long. At this point you're forced to either increase the button width or reduce your font-size, and neither is very designer-friendly.
<style>
.button-flexbox-approach {
display: flex;
justify-content: center;
align-items: center;
-webkit-appearance: none;
border-radius: 0;
border-style: solid;
border-width: 0;
cursor: pointer;
font-weight: normal;
line-height: normal;
margin: 0;
position: relative;
text-align: center;
text-decoration: none;
padding: 1rem 2rem 1.0625rem 2rem;
font-size: 16px;
background-color: #34495e;
color: #fff;
}
.button-flexbox-approach:hover {
color: #fff;
}
.flex-button-container {
display: inline-block;
}
</style>
<div>
<div class="flex-button-container"><a class="button-flexbox-approach" href="#">A Button</a></div>
<div class="flex-button-container" style="max-width: 170px;">
<a class="button-flexbox-approach" href="#">A Button that breaks to two lines</a>
</div>
</div>
The main caveat of this approach is that the button now needs a container. The container doesn't need anything fancy on it, just display: inline-block; to allow the content to scale naturally, and if you want to restrict how large the button can be, add max-width: x;. Other than that, this approach is pretty bullet-proof from my testing and I like it a lot.
<script type="text/javascript">
var flagValidation;
/* validation for 'phone number' */
function PhoneNumberValidation() {
var phoneNum = document.getElementsByName("Phone")[0].value;
var normalPhonepattern = /^[0-9\s\-\+]{6,14}$/g;
if(!normalPhonepattern.test(phoneNum))
{
flagValidation = false;
document.getElementById("PhoneValidation").innerHTML = "Only numbers, '-' and '+' characters are accepted"
}
else
document.getElementById("PhoneValidation").innerHTML = ""
}
function SubmitDetails(){
flagValidation = true;
PhoneNumberValidation();
return flagValidation;
}
</script>
So what is wrong with this picture?

- There's no reason for this to be a script tag on the page; let's make it an external script.
- Mutation: basing the validation on mutating the variable to false should not be the responsibility of these functions.
- The flagValidation variable being globally scoped and mutated/used in several places leaves a lot of places for it to fail when making changes.
- The functions are doing too much. Looking at it from a functional standpoint, they should just be returning a bool, and a final validate function can follow up.
- Repeating code (e.g. document.getElement...) unnecessarily.

When you allow your functions to be purely functional, this function...
function PhoneNumberValidation() {
var phoneNum = document.getElementsByName("Phone")[0].value;
var normalPhonepattern = /^[0-9\s\-\+]{6,14}$/g;
if(!normalPhonepattern.test(phoneNum))
{
flagValidation = false;
document.getElementById("PhoneValidation").innerHTML = "Only numbers, '-' and '+' characters are accepted"
}
else
document.getElementById("PhoneValidation").innerHTML = ""
}
Can become...
function isPhoneNumberValid() {
const phoneNumber = document.getElementsByName("Phone")[0].value;
const phoneNumberRegex = /^[0-9\s\-\+]{6,14}$/g;
return phoneNumberRegex.test(phoneNumber);
}
Much prettier, right? Once we've refactored all of those individual functions, the main input validation function looks like this:
function validateFormInputs(event) {
let isFormValid = true;
const phoneNumberFeedback = document.getElementById("PhoneValidation");
if (isPhoneNumberValid()) {
phoneNumberFeedback.innerHTML = '';
} else {
phoneNumberFeedback.innerHTML = "Only numbers, '-' and '+' characters are accepted";
isFormValid = false;
}
if (isFormValid) {
contactForm.removeEventListener('submit', validateFormInputs);
return true;
} else {
event.preventDefault();
}
}
It's cleaner, sure, but I'm still not okay with using and mutating that isFormValid variable, or with innerHTML appearing every other line. Let's take it further. Let's outsource the error message work to a utility function.
function generateErrorMessage(element, message) {
return element.innerHTML = message;
}
// So we use that like this...
if (isPhoneNumberValid()) {
generateErrorMessage(phoneNumberFeedback, '');
} else {
generateErrorMessage(phoneNumberFeedback, 'Cannot be empty');
isFormValid = false;
}
The next step is to stop mutating that validity flag. To do this, I'm going to bundle all the validation methods into an object and then reduce that to return an isFormValid bool.
const fields = {
phoneNumber: {
isFieldValid: function() {
const phoneNumber = document.getElementsByName("Phone")[0].value;
const phoneNumberRegex = /^[0-9\s\-\+]{6,14}$/g;
return phoneNumberRegex.test(phoneNumber);
},
userFeedbackElement: document.getElementById("PhoneValidation"),
errorMessage: "Only numbers, '-' and '+' characters are accepted"
}
};
// Generate an array from the keys of the fields object and reduce it
Object.keys(fields).reduce((acc, curr) => {
// do stuff
}, true);
If you're not familiar with Array.reduce, it will iterate over each item in the array and allow you to process them down into a single value. The arguments are acc (the accumulator) and curr (the current item). The idea is, we're going to execute each function and then show/hide error messages accordingly. The function now looks like this:
function validateFormInputs(event) {
const isFormValid = Object.keys(fields).reduce((acc, curr) => {
const currentField = fields[curr];
if (currentField.isFieldValid()) {
generateErrorMessage(currentField.userFeedbackElement, '');
return acc;
} else {
generateErrorMessage(currentField.userFeedbackElement, currentField.errorMessage);
return false;
}
}, true);
if (isFormValid) {
contactForm.removeEventListener('submit', validateFormInputs);
return true;
} else {
event.preventDefault();
}
}
This implementation is clearly a case-by-case thing. It works for my particular scenario because there's only one validation condition for each field. If there were more rules, the approach would need to change to compensate, and it might not be able to stay as dynamic. It should also be noted that this is a fairly over-engineered solution. I wouldn't say that the original approach is wrong, but my approach looks at the same problem from a functional programming standpoint, and I believe it is much cleaner and much more robust. For a view of the entire file, see my gist at https://gist.github.com/3stacks/c5c49904684e4ddec48aa017ab912db9
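For what it's worth, if a field did need several rules, one direction (purely a sketch; the rules and getValue keys are hypothetical and not part of the gist) could be to hold an array of rules per field and surface the first failure:
// Sketch only: hypothetical multi-rule shape for a field.
const fieldsWithRules = {
    phoneNumber: {
        getValue: () => document.getElementsByName('Phone')[0].value,
        userFeedbackElement: document.getElementById('PhoneValidation'),
        rules: [
            {
                isValid: value => value.length > 0,
                errorMessage: 'Cannot be empty'
            },
            {
                isValid: value => /^[0-9\s\-\+]{6,14}$/.test(value),
                errorMessage: "Only numbers, '-' and '+' characters are accepted"
            }
        ]
    }
};

// Returns the first failing rule's message, or null if every rule passes
function getFieldError(field) {
    const value = field.getValue();
    const failingRule = field.rules.find(rule => !rule.isValid(value));
    return failingRule ? failingRule.errorMessage : null;
}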
]]>components are defined with a name and a selector, for example ".site-nav" or "body". You define all components in the components array, but then you can cherry-pick which ones are used on each page; the homepage may use the hero component, for instance, but the about page may not.
{
"sizes": ["320x480", "1280x768", "1920x1080"],
"pages": [
{
"name": "homepage",
"url": "http://localhost:3000/",
"components": ["hero", "all"]
}
],
"components": [
{
"name": "all",
"selector": "body"
},
{
"name": "hero",
"selector": ".hero"
}
]
}
Since I'm generally against installing npm packages globally (and you probably should be too), I define my capture scripts in package.json. This presents the first issue. The usage of Argus is like so: argus-eyes capture <branch-name>. But this of course only names the capture for you; it's your responsibility to switch branches. So the workflow becomes:
1. Check out the develop branch
2. argus-eyes capture develop (this is the baseline)
3. Check out your feature-branch-name
4. argus-eyes capture feature-branch-name
5. argus-eyes compare develop feature-branch-name

Argus then uses blink-diff to compare the two sets of screenshots you just captured (note: you shouldn't change your config between captures) and outputs any screenshots in which there are visual differences. For example, bumping the padding on your nav will result in something like this. It's not a super intelligent representation, but it does quickly show you that something is wrong. In my opinion, the current workflow makes it almost not worth bothering with. So how do we make it a one-step test?
I am attempting to simulate this entire process in node. For this, we'll need a few things.
I've tried to make the node script as pure as possible. I created a file called argus-test.js. In that file, there is an individual function for each git action. First is a function to open the repository.
/**
* @param {string} path - path to the repository (.git)
* @returns {Promise}
*/
function openRepository(path) {
return Git.Repository.open(path);
}
// Path is based on current working directory
const repoPath = require("path").resolve("./.git");
openRepository(repoPath).then(...)
openRepository returns a Promise which resolves with a reference to the repository. To act on the repository, we need to keep track of this returned value. Since all of the nodegit functions return Promises, we're going to be seeing a lot of then.
// Initialise this let to keep track of which branch we're on
let featureBranch;
/**
* @param {Repository} repo - The reference to the repository object
* @returns {Promise}
*/
function saveCurrentBranch(repo) {
return repo.getCurrentBranch();
}
openRepository(repoPath).then(
repo => {
saveCurrentBranch(repo).then(repoName => {
featureBranch = repoName;
});
},
err => {
// Usually would only happen if you give it the incorrect path
throw new Error(err);
}
);
Now that we have a reference to the current feature branch stored for later, in the callback where we set the featureBranch variable we're going to execute our capture command.
shell.exec(
`node node_modules/argus-eyes/bin/argus-eyes.js capture ${featureBranch}`
);
// Successful output will say something like "12 screenshots saved to .argus-eyes/feature-branch-name"
This is the tricky part and the biggest hurdle: we have to switch branch to whatever the base is (develop in this case). Although the function is simple, if there are any uncommitted changes the checkout may fail, so it's probably best to warn the user to make sure all changes are committed or stashed first (sketched below).
/**
* @param {Repository} repo - The reference to the repository object
* @returns {Promise}
*/
function switchToDevelop(repo) {
return repo.checkoutBranch('develop');
}
switchToDevelop(repo).then(...)
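To cover that warning, something like this could work (a sketch only; assertCleanWorkingDirectory isn't part of the gist, and it leans on nodegit's getStatus):
/**
 * Sketch only - bail out early if the working directory has uncommitted changes.
 * @param {Repository} repo - The reference to the repository object
 * @returns {Promise}
 */
function assertCleanWorkingDirectory(repo) {
    return repo.getStatus().then(statuses => {
        if (statuses.length > 0) {
            throw new Error('Commit or stash your changes before running the visual regression test');
        }
        return repo;
    });
}

// e.g. assertCleanWorkingDirectory(repo).then(switchToDevelop)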
After successfully changing to develop, we still have to capture the branch and then compare them, which is done like so:
shell.exec('node node_modules/argus-eyes/bin/argus-eyes.js capture develop');
shell.exec(
'node node_modules/argus-eyes/bin/argus-eyes.js compare develop ' +
featureBranch
);
If Argus detects any screenshots over the threshold for change, it will save the diff in a folder like .argus-eyes/diff_develop_feature_branch_name
For the full file in action, check out this gist: https://gist.github.com/3stacks/0976ef8a84c50c6096aea09dbbbebd88
To improve this process, it might be an idea to save the baseline captures in the repo and then overwrite them whenever you push to that branch. This would eliminate the need to switch between branches.
]]>const appState = {
key1: {...},
key2: {...}
}
and set the data like this:
localStorageManager.set('appData', appState);
The issue with this is that you may not want key1 and key2 to be grouped together under one key, but you also don't want them tossed straight into local storage as top-level entries. With namespaces you can do this:
localStorageManager.set('key1', key1, 'myAppState');
localStorageManager.set('key2', key2, 'myAppState');
This makes it easier to access all of your data at once while still keeping those keys theoretically separate. When accessing the namespaced data, you simply add the namespace as the second arg like so:
localStorageManager.get('key1', 'myAppState');
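A guess at how that could work internally (this is not the actual package source): each namespace is a single storage entry holding all of its keys as one serialised object.
// Sketch only: namespaced reads/writes grouped under one localStorage entry.
function setNamespaced(key, value, namespace) {
    const existing = JSON.parse(localStorage.getItem(namespace) || '{}');
    existing[key] = value;
    localStorage.setItem(namespace, JSON.stringify(existing));
}

function getNamespaced(key, namespace) {
    const existing = JSON.parse(localStorage.getItem(namespace) || '{}');
    return existing[key];
}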
The package is now more robust internally and handles cases of missing data better. It also uses the getItem and setItem methods internally instead of accessing localStorage directly. To get started, install via npm with npm install @lukeboyle/local-storage-manager. See the npm page for documentation and in-depth instructions: https://www.npmjs.com/package/@lukeboyle/local-storage-manager
npm install and then npm test. I first ran into the issue in an Angular project that had tests triggered by the prepublish command. My CI build failed, and I decided to remove the prepublish hook and change the name of my test script until I had time to come back to it. For months I've been avoiding the issue, but I have finally solved it. The Karma docs suggest that you can run the tests in Firefox with the --browsers flag (see https://karma-runner.github.io/0.8/plus/Travis-CI.html). Travis has since been updated so that Chrome can be loaded into the environment. For this to work, you'll need to make changes to your travis.yml file and your karma config file.
travis.yml
Note that I'm only using the latest node, as that is all I require.
language: node_js
node_js:
  - "node"
before_script:
  - export CHROME_BIN=chromium-browser
  - export DISPLAY=:99.0
  - sh -e /etc/init.d/xvfb start
The before_script is the special part, which points Travis in the right direction for running Chrome. The last two lines are addressed in the karma docs linked above. Personally, I am using a separate karma config file, and I want to make the changes within that config file to keep my test script clean. My test script is:
"test": "karma start karma.config.js"
karma.config.js
const configuration = {
files: [{ pattern: 'tests/**/**/**.*', watched: true }],
customLaunchers: {
chromeTravisCi: {
base: 'Chrome',
flags: ['--no-sandbox']
}
},
frameworks: ['mocha'],
browsers: ['Chrome'],
failOnEmptyTestSuite: true,
singleRun: true
};
if (process.env.TRAVIS) {
configuration.browsers = ['chromeTravisCi'];
}
module.exports = function(config) {
config.set(configuration);
};
Luckily, Travis sets the TRAVIS environment variable, and if we detect it we switch the configuration's browsers to ['chromeTravisCi'], which is defined in customLaunchers. Have whatever pre-processors you need in the configuration object and it should work fine when you deploy.
]]>For my project I'm using Webpack and plain npm scripts. Whatever your choice of build process, the important part is what your babel config or .babelrc is set up with.
plugins: [
'transform-runtime',
'transform-vue-jsx'
],
presets: ['es2015']
That's the basic requirement for getting started. To install those, run:
npm install -D babel-plugin-transform-runtime
npm install -D babel-plugin-transform-vue-jsx babel-helper-vue-jsx-merge-props babel-plugin-syntax-jsx
npm install -D babel-preset-es2015
The official repo for the Vue jsx is located here: https://github.com/vuejs/babel-plugin-transform-vue-jsx. The interesting part about Vue jsx, in my opinion, is that it follows the Angular pattern for registering components. Whereas in React you just import a function that returns jsx and can name it whatever you like, in Vue jsx you must declare the name and register the component globally. Vue has a component method that takes a name and an object with all relevant data. The difference is that instead of a template entry, there's a render function which returns jsx.
Vue.component('jsx-example', {
render (h) { // <-- h must be in scope
return <div id="foo">bar</div>
}
})
// Usage
<div>
<jsx-example/>
</div>
h is shorthand for the Vue instance's $createElement method, so you have to make sure that h is in scope in your components, like so:
const pageView = new Vue({
el: '#root',
data: {},
methods: {},
render () {
const h = this.$createElement;
return (
<div>
<jsx-example/>
</div>
)
}
});
From the get go it seems to me like we've lost some of the versatility that jsx provides by having to integrate it into the normal Vue component pattern.
return (
<div
// event listeners are prefixed with on- or nativeOn-
on-click={this.clickHandler}
nativeOn-click={this.nativeClickHandler}
key="key"
ref="ref">
    </div>
)
There's a strange thing where on-change on a form input seems to be naturally debounced, and nativeOn-change doesn't seem to behave any differently. The behaviour also differs from React classes, where you can refer to an element with this.refs; here you need to use this.$refs, which follows the usual Vue convention. Since there's no documentation surrounding the jsx, I'm assuming the rest of the behaviour follows the standard Vue component pattern, but with a render function instead of a template. The jsx doesn't support the normal Vue directives, so you'll have to do any of those things programmatically.
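For example, the v-if and v-for equivalents end up as plain JavaScript expressions (a sketch; isVisible and items are made-up data properties):
const listView = new Vue({
    el: '#root',
    data: {
        isVisible: true,
        items: [{ id: 1, name: 'First' }, { id: 2, name: 'Second' }]
    },
    render () {
        const h = this.$createElement;
        return (
            <ul>
                {/* v-if equivalent: short-circuit the expression */}
                {this.isVisible && <li>Conditionally rendered</li>}
                {/* v-for equivalent: Array.map */}
                {this.items.map(item => <li key={item.id}>{item.name}</li>)}
            </ul>
        );
    }
});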
After much frustration with this issue, I found the React-Tap-Event-Plugin section in the react material-ui documentation. Custom components like the select field don't work well with the traditional onClick listener, so the react-tap-event-plugin must be included in your react project. The dependency is supposedly only a temporary fix. See the repo here: https://github.com/zilverline/react-tap-event-plugin
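For reference, wiring it up is tiny; as far as I can tell from the plugin's README, it just needs to be injected once at the app's entry point before anything renders:
// Sketch based on the plugin's README: inject once, before the app renders.
import injectTapEventPlugin from 'react-tap-event-plugin';

injectTapEventPlugin();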
]]>Earl Sweatshirt - Earl
Big Boi - Sir Lucious Left Foot
Nas & Damian Marley - Distant Relatives
The Roots - How I Got Over
Kanye West - My Beautiful Dark Twisted Fantasy
Kanye West & Jay Z - Watch The Throne
Danny Brown - XXX
Kendrick Lamar - Section.80
Drake - Take Care
A$AP Rocky - Live.Love.A$AP
Shabazz Palaces - Black Up
Tyler, The Creator - Goblin
Killer Mike - Pl3dge
Death Grips - Exmilitary
LIL UGLY MANE - MISTA THUG ISOLATION
ScHoolboy Q - Habits and Contradictions
GOOD Music - Cruel Summer
Killer Mike - R.A.P Music
Kendrick Lamar - Good Kid m.A.A.d City
Flatbush Zombies - Better off Dead
Earl Sweatshirt - Doris
Childish Gambino - because the internet
Young Fathers - Tape Two
Chance The Rapper - Acid Rap
A$AP Ferg - Trap Lord
Tyler, The Creator - WOLF
Kanye West - Yeezus
Pusha T - My Name Is My Name
Busdriver - Perfect Hair
clipping. - CLPPNG
ScHoolboy Q - Oxymoron
Open Mike Eagle - Dark Comedy
Vince Staples - Hell Can Wait EP
Run The Jewels - Run The Jewels 2
Freddie Gibbs & Madlib - Pinata
Tyler, The Creator - Cherry Bomb
Lupe Fiasco - Tetsuo and Youth
Wale - The Album About Nothing
BADBADNOTGOOD and Ghostface Killah - Sour Soul
Drake - If You're Reading This It's Too Late
Joey B4DA$$ - B4DA$$
Earl Sweatshirt - I Don't Like Shit I Don't Go Outside
Death Grips - The Powers That B
A$AP Rocky - At Long Last A$AP
Dr. Dre - Compton
Jay Rock - 90059
Vince Staples - Summertime '06
Kendrick Lamar - To Pimp A Butterfly
]]>project-root
|--src
|  |--index.jsx
|--index.js
|--rollup.config.js (OR)
|--webpack.config.js
|--demo
|  |--dist
|  |  |--build files
|  |--src
|  |  |--src files
index.jsx
import * as React from 'react';
export default function ReactComponent(props) {
return <div>Job's Done</div>;
}
Also, to play your part in improving our package ecosystem, consider namespacing your package for npm: http://blog.npmjs.org/post/116936804365/solving-npms-hard-problem-naming-packages
]]>Disclaimer: Shopify is not good. I recommend steering clear and opting for one of many alternatives. It's an extremely closed platform that doesn't encourage innovation and naturally leans towards bad practice. Given this, if you still have to use it, read on.
In Shopify, there is a native (albeit 'unsupported') filtering system. Native filtering is based on the tags you specify on your product. If you go to your collection, you can link the user to a tag, and Shopify can filter products with simple Javascript like so: collections/collection-name/tag-one/tag-two. Now, given that in a collection you have access to collection.all_vendors and all_types, WHY OH WHY is there no native filtering based on those? Filtering could EASILY be dynamic if Shopify cared enough to implement that. The 'official' solution (as per the documentation: https://help.shopify.com/themes/customization/collections/filtering-a-collection-with-multiple-tag-drop-down) is to make several drop-downs and set tags to be a list of tags you want to allow filtering by (e.g. tags = "red", "blue", "green"). So next week when I add a yellow shirt I have to go back into the pits and add another tag? Not happening. This is how I make filters dynamic. After searching for hours, I can conclusively say that there is no open source solution for this, and given the constraints of the garbage liquid templating engine, I can confidently say that this is the least convoluted solution available. All it takes is implementing a rigid structure in your tagging system, so this is much easier on a new store. The tag structure is basically category:tagName. Let's say you want to filter your products by brand: in your product page, in the tags section, enter brand:brandName. Same goes for size:1 or color:blue. It's up to you how many you use, because I guarantee your collection sorting template is going to be a BIG file. The best part about all this is that there's no array filter or equivalent method in liquid, so we're going to have to do some crazy shit.
{% for tag in collection.all_tags %} <-- Start iterating over all tags
  {% if tag contains 'style' %} <-- Check if it contains your keyword
    {% capture raw_style_tags %} <-- Initialise the variable raw_style_tags
      {{ raw_style_tags | append: tag | append: ', ' }} <-- Build a string of tags separated by commas
    {% endcapture %}
    {% assign style_tags = raw_style_tags | split: ', ' %} <-- Split the string on the commas to build a new array
  {% endif %}
{% endfor %}
The variable style_tags is now an array of all tags including 'style:'. Now, you will make a select field where the options are all of your style tags. Note that current_tags returns a list of the tags you are currently filtering by.
<label>Shop by style</label>
<select class="coll-filter">
  <option value="">All</option>
  {% for t in style_tags %}
    {% assign tag = t | strip %}
    {% if current_tags contains tag %} <-- check if the tag is currently active - applies selected attribute
      <option selected>{{ tag | remove: 'style:' }}</option>
    {% elsif product_tags contains tag %} <-- else, just make it an option
      <option>{{ tag | remove: 'style:' }}</option> <-- use the remove filter to have just the tag name
    {% endif %}
  {% endfor %}
</select>
If you include the Javascript from the Shopify docs, it will automatically listen for changes to that .coll-filter. This way, if you ever add any more tags under the style: category, you won't have to update your view. And the best part is, you can just add a new category in your product page, copy paste those lines of code and change 'style' to whatever your new category is called.
I must reiterate, you should only use Shopify if you have no other choice. Cheers!
Outwardly, the changes are minimal. The most obvious change is that the add module dialogue is now a modal instead of a floating column element. Various styles have been optimised and reduced as much as possible so the button sizes specifically are more consistent across browsers.
Around three quarters of the way through version 1 it became apparent that the app was outgrowing the constraints of the Vue system I had created, so the app has been rebuilt in React.js and Redux.

The standard module model
Using this model, every module has a content object and an event object under it. The content object handles calendar events, Asana workspaces and so on. Adhering to this model will allow for rapid development of new modules in future.

Events
The event system is simulated using the Redux middleware called Thunk. The base dispatch will set the event to executing, and it will continue to execute until it is told to stop. If error is true, the event stops executing and the error response is populated in the response key. Error false means the event resolved correctly and the response is the delicious events or tasks. React also makes rendering the correct component a breeze: I know to hide all content if the user hasn't authorised, or while the event is executing. Error messages are nice and simple too. https://youtu.be/T43RzjxwBys

Next Steps
Agander is being temporarily put on hold to focus on other projects, but in its current state it is very much usable. Aside from bug fixes, there will be no new features for at least a couple of months while I'm working on other things. I'm really happy with how far the app has come and I can finally use it for my own agenda tracking.
]]>Color ID | Color Name | Hex Code |
---|---|---|
undefined | Who knows | #039be5 |
1 | Lavender | #7986cb |
2 | Sage | #33b679 |
3 | Grape | #8e24aa |
4 | Flamingo | #e67c73 |
5 | Banana | #f6c026 |
6 | Tangerine | #f5511d |
7 | Peacock | #039be5 |
8 | Graphite | #616161 |
9 | Blueberry | #3f51b5 |
10 | Basil | #0b8043 |
11 | Tomato | #d60000 |
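If you need that mapping in code, a simple lookup built from the table works (a sketch; it assumes the event object carries colorId the way the Calendar API returns it):
// Lookup built from the table above; events without a colorId fall back to the default blue.
const EVENT_COLOURS = {
    1: '#7986cb',  // Lavender
    2: '#33b679',  // Sage
    3: '#8e24aa',  // Grape
    4: '#e67c73',  // Flamingo
    5: '#f6c026',  // Banana
    6: '#f5511d',  // Tangerine
    7: '#039be5',  // Peacock
    8: '#616161',  // Graphite
    9: '#3f51b5',  // Blueberry
    10: '#0b8043', // Basil
    11: '#d60000'  // Tomato
};

function getEventColour(event) {
    return EVENT_COLOURS[event.colorId] || '#039be5';
}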
I switched to Apple Music (this is not an endorsement; there's plenty wrong with Apple Music too). The reason I chose Apple is because:
Authorising the user and displaying their tasks is reasonably easy following the quickstart guide here. Essentially, requests are separated into two categories: either tasks or tasklists. When you have loaded the tasks API, you can see the basic structure and work from there (see the API Reference for JS). To find the task lists, you would use the list function (it returns an array of tasklist objects).
function listTaskLists(gAPI) {
var request = gAPI.client.tasks.tasklists.list({
'maxResults': 10
});
request.execute();
}
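If you want to do something with the result, execute also accepts a callback that receives the parsed response. A small sketch (logging the titles is just an example):
// Sketch only: read the task list titles out of the list response.
function logTaskListTitles(gAPI) {
    gAPI.client.tasks.tasklists.list({ 'maxResults': 10 }).execute(function(response) {
        (response.items || []).forEach(function(taskList) {
            console.log(taskList.title);
        });
    });
}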
Finding tasks in a given task list operates much the same way; however, you are dealing with Google here, so it's tasks.tasks.list... The basic parameters here would just be the tasklist you want to pull tasks from, but there are other options.
function getTasksByListId(gAPI, tasklistId) {
    var request = gAPI.client.tasks.tasks.list({
        'tasklist': tasklistId
    });
    request.execute();
}
So, we've covered getting the tasks; how do we manipulate them? That's where the tricky part comes in. The gapi client interactions we used before have an update method. However, whenever I called update on anything, I got a 400 error with 'Invalid Value'. This is a common issue I've observed online with no real solutions. The gist of it is that there is a bunch of 'required parameters' for you to include in the request, but there is absolutely no documentation on them (thanks Google). To get around this, we found that it was simply easier to request it outright using the request method and giving it a url. The path parameter requires a tasklist id and a task id; this is basically the url that comes down with the getTasksByListId request. Make sure you define the method as PUT, and you pass the whole task object with your updated values to Google. In this instance, we are marking the task as 'completed' and giving it a completed timestamp.
function markTaskComplete(gAPI, tasklistId, task) {
    gAPI.client.request({
        // tasklistId and task.id come from the earlier list requests
        path: 'https://www.googleapis.com/tasks/v1/lists/' + tasklistId + '/tasks/' + task.id,
        method: 'PUT',
        body: Object.assign(
            {},
            task.originalTask,
            {
                completed: new Date().toISOString(),
                status: 'completed'
            }
        )
    }).execute();
}
Now you have a basis, the world is your oyster.
]]>