{ "version": "https://jsonfeed.org/version/1", "title": "Boyleing Point", "home_page_url": "https://boyleingpoint.com", "feed_url": "https://boyleingpoint.com/rss/feed.json", "description": "Psychotic ramblings about technology & culture", "icon": "https://boyleingpoint.com/android-icon-192x192.png", "author": { "name": "Luke Boyle", "url": "https://boyleingpoint.com" }, "items": [ { "id": "https://boyleingpoint.com/blog/posts/our-god-the-painter", "content_html": "
\nWhen I look at nature I see a glimpse of the infinite creativity of our Father. Not only did He create every species, down to the seemingly endless variations of trees, brush, mammals, and insects, but He has seen every distinct movement of every creeping thing, down to the blood pulsing through their veins and the fur on their backs swaying in the wind. He has seen the way every single photon cast off from the sun makes contact with the trees, diffusing through their leaves, and touching down on the ground for all of history ad infinitum.
\n\nAs I gaze at the sky, He gazes back. He is the light that makes contact with the photoreceptor cells in my eyes to reveal the majesty of creation. And every sunset that splashes across the sky, He has painted thoughtfully on an ever-shifting canvas. Yet to me, every day, in its own way, the sun sets more beautifully than I could possibly imagine. And in my limitation, every day is a mystery.
\n\n", "url": "https://boyleingpoint.com/blog/posts/our-god-the-painter", "title": "Our God, the painter", "summary": "In celebrating the beauty of creation, I see God, and I celebrate my limitations.", "date_modified": "2023-08-14T00:00:00.000Z", "author": { "name": "Luke Boyle", "url": "https://boyleingpoint.com" } }, { "id": "https://boyleingpoint.com/blog/posts/in-search-of-humanity", "content_html": "Having regrettably fallen prey to the allure of dating apps again, I am filled with a renewed sense of sorrow for the sheer number of people who, rather than seeming to express any genuine individual personality, appear to have been stamped out on an assembly-line and ejected from a factory. The most pervasive (although perhaps the most understandable) of these living tropes is the "traveller" persona. For example the prompt on Hinge "What I'm looking for" is more often than not answered with "Someone to travel with". I'm not against travelling, but for large swathes of people in our wealthy, decadent society, travelling, instead of a means of pursuing enriching experience, has become an end in itself. This phenomenon is epitomised by the Contiki tour or in "Cruising".
\nFor modern man, travelling to another country typically offers no transcendent experience beyond drinking, eating, and taking the same photo of the Leaning Tower of Pisa that tens of millions of people have taken before you. This ritual is no different to me than if you just went to a local district themed after your culture of choice, ate at a restaurant where they spoke their native tongue to you, and then stepped outside, stuck your head through a photo stand-in, and checked off the photo in front of your attraction of choice. The only real difference is some abstract notion that you were in the correct geographic location, and therefore you may now tick the country off your list. The article The age of average describes a creeping void of homogeneity using Airbnb and cafe interior design (see fig. 1 below) to illustrate how, all over the world, interior designers have unconsciously agreed upon a globally homogeneous style-guide which affords well-to-do individuals the ability to travel to the other side of the world and see nothing new. Sure, you may go to a busy street market in Phnom Penh and see all sorts of people living drastically different lives to you, but at the end of the day you'll retire to a luxurious villa and forget all about the poverty surrounding you.
\n\nFigure 1: The age of average
\nEvola's Meditations on the Peaks captures this perfectly in the context of mountain climbing, which had, in his view, been corrupted and trivialised into simply another vain pursuit of hedonism: "we cannot help but notice the presence among our young people, of love for risk and even of heroism. [...] mountain climbing, when experienced only in keeping with this view, would not be easily distinguished from the pursuit of emotions for their own sake". Evola continues, "This pursuit of radical sensations generates all kinds of extravagant and desperate feats and bold acrobatic activities [...] All things considered, these things do not differ very much from other excitements or drugs, the employment of which suggests the absence rather than the presence of a true sense of personality". To Evola, the spiritual majesty of mountains in antiquity arose from their inaccessibility. Virtually every ancient civilisation situated near mountains saw them as possessing some essence of immortality, conceiving of the peak as a separate plane of existence.
\nTechnological advancements that make it easier to summit a mountain cheapen the experience. Today, when virtually every mountain has been conquered, and we have helicopters and drone footage of the peaks, the mystery has been completely devoured by the machines of modernity. Consider the case of Tabletop Mountain in Toowoomba. In 2017 there was a proposal to build a cable car from Picnic Point across to the mountain. Tabletop Mountain is a tough but manageable climb for your average able-bodied person, but you still have to make a physical commitment to reach the top. Today, you can climb Tabletop Mountain and find yourself completely alone, slightly closer to God, and able to look across the rolling hills at an ever-widening horizon. By making it accessible to everyone, you inevitably destroy what little mystique remains.
\nThis brings us back to the "traveller" persona. I cannot blame nor judge these people. After all, what spiritually transcendent activities can your average man really engage in today? The potential for enrichment has been sucked out of virtually every activity, and we are told there are no spiritual aspects to pursue within ourselves. You shouldn't have children because it's unaffordable, or because it's bad for the environment. You shouldn't pursue God, because that's for un-developed Neanderthals who haven't yet heard the gospel of Science. The only option presented to these people to achieve fulfilment is travelling, and it makes sense, because there is still a notion of triumph in travelling. You cross vast oceans in a matter of hours, whereas your ancestors - if they were even able to travel - would be crammed into the hull of a ship for weeks or months just to see one new country. Unfortunately for these people, when they make it overseas, they'll typically find many of the same trappings as they are accustomed to at home (see below for an image of an "English" street vandalised with American fast-food chains). In my own life, the last overseas travel I did was to Cambodia in 2018, and after I went to the Killing Fields and heard the harrowing tale, I returned to my hotel and on the same street found a Burger King and a Cold Stone Creamery.
\n\nFigure 2: An "English" high street
\nThe end result of all of this is the complete commoditisation of spirituality; a sort of drive-thru baptism where people are told that enrichment of the soul is yet another product to be consumed, rather than a lifelong pursuit within yourself. People are sold the idea that to be enriched, all you need to do is buy a ticket to see the sunrise at Angkor Wat and you'll be whole. Given enough time, I believe that even this level of spiritual enrichment is going to be made impossible for the average man (due to climate restrictions and the theft of their discretionary income by central banks) and travel may again be reserved for the social elite. Perhaps then our people will once again begin to look within.
\n\n\n", "url": "https://boyleingpoint.com/blog/posts/in-search-of-humanity", "title": "In search of humanity", "summary": "When you look at the world today, is it really surprising that people are more aimless and depressed than in all of modern history? Though our material conditions are still leagues ahead of our forebears, we have been completely deracinated and given nothing to aspire to.", "date_modified": "2023-04-02T00:00:00.000Z", "author": { "name": "Luke Boyle", "url": "https://boyleingpoint.com" } }, { "id": "https://boyleingpoint.com/blog/posts/i-must-humble-myself", "content_html": "Those who are irresistibly attracted to the mountains have often only experienced in an emotion a greatness that is beyond their understanding. They have not learned to master a new inner state emerging from the deepest recesses of their beings. Thus, they do not know why they seek increasingly wider horizons, freer skies, tougher peaks; or why, from peak to peak, from wall to wall, and from danger to danger, through their experiences they have become inexplicably disillusioned with everything that, in their ordinary lives, appeared to them as most lively, important, and exciting
\n
\n\nTwo men went up to the temple to pray, one a Pharisee and the other a tax collector. The Pharisee stood by himself and prayed: ‘God, I thank you that I am not like other people—robbers, evildoers, adulterers—or even like this tax collector. I fast twice a week and give a tenth of all I get.’ But the tax collector stood at a distance. He would not even look up to heaven, but beat his breast and said, ‘God, have mercy on me, a sinner.’ I tell you that this man, rather than the other, went home justified before God. For all those who exalt themselves will be humbled, and those who humble themselves will be exalted.
\n
Luke 18:9-14
\nYou must humble yourself. For all those who exalt themselves will be humbled, but the one who humbles himself will be exalted. Death is the ultimate sign of spiritual rebirth, and to die in the name of God would surely be the greatest honour a Christian can attain.
\nFor my money, the crucifixion is the greatest form of humbling one could experience. Not only are you brutally executed and displayed as a warning to would-be "law"-breakers, but those who purport to be your brothers and sisters rally around in lemming-like obedience and celebrate your execution. After all, your fate has been determined by our god, the State.
\nFor the Son of God to willingly humble himself to this degree, in the face of utter betrayal from his people, in the face of false accusations, who are you to have any pride in your insufficient self or your own abilities? If the time ever comes that we should choose our faith or death, Lord please give me the strength to choose You in the face of death. Father, into your hands I commit my spirit.
\nAmen.
\n", "url": "https://boyleingpoint.com/blog/posts/i-must-humble-myself", "title": "I must humble myself", "summary": "A prayer for humility, and a celebration of my insufficiency.", "date_modified": "2023-03-24T00:00:00.000Z", "author": { "name": "Luke Boyle", "url": "https://boyleingpoint.com" } }, { "id": "https://boyleingpoint.com/blog/posts/sweden-a-case-study-in-herd-immunity", "content_html": "You may note that you have scarcely heard a peep about Sweden in the last 6-10 months. I would contend that the reason for this is the overwhelming success of their no-lockdown, no-mask-mandate policies. In 2020, you may have seen articles such as TIME Magazines The Swedish COVID-19 Response Is a Disaster. It Shouldn’t Be a Model for the Rest of the World. Nothing like a bit of fear porn to get the cortisol pumping and make the serfs beg for more government "protection".
\nIn Figure 1 below, we can see Sweden's annual death count. The average annual death count between 2011 and 2019 was 90,675, and the total deaths in 2020 were 97,941. This represents an 8.01% increase in total deaths, of which 9,771 (9.98% of total deaths) were attributed to COVID-19. Assuming those who died from COVID-19 wouldn't have died from other causes,* Sweden would have had 2.76% fewer deaths than the average. This could be explained by the voluntary behavioural changes that people make in the absence of government dictates.
\n* which is an unlikely scenario given that COVID-19 rarely kills anyone in the absence of serious co-morbidities. In fact, the NHS reports that out of 130,624 deaths, only 3,656 deaths (2.80% of deaths) occurred where a pre-existing condition was not recorded (see appendix 1 for source).
\nFigure 1: Sweden death rate\n
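\nThe excess-death arithmetic above can be double-checked with a few lines of Python, using only the figures quoted in the text:

```python
# Figures quoted above: Sweden's average annual deaths 2011-2019,
# total deaths in 2020, and deaths attributed to COVID-19 in 2020.
avg_deaths_2011_2019 = 90_675
total_deaths_2020 = 97_941
covid_deaths_2020 = 9_771

# Increase in total deaths over the 2011-2019 average.
excess_pct = (total_deaths_2020 - avg_deaths_2011_2019) / avg_deaths_2011_2019 * 100

# COVID-19 deaths as a share of all 2020 deaths.
covid_share_pct = covid_deaths_2020 / total_deaths_2020 * 100

# Non-COVID deaths in 2020, relative to the historical average.
deficit_pct = (avg_deaths_2011_2019 - (total_deaths_2020 - covid_deaths_2020)) / avg_deaths_2011_2019 * 100

print(round(excess_pct, 2))       # 8.01
print(round(covid_share_pct, 2))  # 9.98
print(round(deficit_pct, 2))      # 2.76
```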
\nSweden now has a greater than 50% vaccination rate and they have experienced 28 deaths between July and mid-August.
\nAs I mentioned in my post On mandatory COVID-19 vaccinations, the COVID-19 deaths are only one factor when it comes to measuring the impact of the pandemic (and the government response).
\nSweden's inflation rate peaked in April 2021 at 2.2% YoY (Figure 2), at which point the trend reversed, with 1.3% YoY inflation reported in June 2021. By contrast, the United States' inflation "peaked" in June 2021 at a massive 4.5% YoY, but we can't yet call this a peak since the trend is still upwards. This can be partly explained by the US policy of closing businesses and paying workers to stay home via enhanced unemployment benefits. I couldn't find any statistics on Sweden's business closures, but in the United States, Yelp data showed that 60% of business closures became permanent in 2020, and the year closed out with 200,000 extra permanent business closures.
\nFigure 2: Sweden inflation rate\n
\nFigure 3: United States inflation rate \n
\nAppendix 1: NHS Freedom of Information request regarding COVID deaths\n
\n", "url": "https://boyleingpoint.com/blog/posts/sweden-a-case-study-in-herd-immunity", "title": "Sweden: a case study in herd immunity", "summary": "Exploring the success of Sweden's no-lockdown policy ", "date_modified": "2021-08-09T00:00:00.000Z", "author": { "name": "Luke Boyle", "url": "https://boyleingpoint.com" } }, { "id": "https://boyleingpoint.com/blog/posts/on-mandatory-covid-vaccinations", "content_html": "There have been a few pandemic scares in my time, the most far reaching of those being the Swine Flu (H1N1) outbreak in 2009. Swine flu infected approximately 60.8 million Americans and the CDC found that 87% of deaths occurred in the under 65 age group with a death rate of roughly 0.02%. By comparison, the influenza virus (according to regular flu season data) has a death rate of 0.13% (roughly 7 times more deadly than H1N1). Public health officials didn't react strongly to stop the spread of H1N1 and they were vindicated in their response by sheer luck. Had the virus been more deadly or more transmissible their lack of reaction could have caused hundreds of thousands or millions of deaths.
\nWith COVID-19, early signals from China were very worrying indeed. By January 2020 we saw footage of people collapsing in the streets, people being barricaded in their houses, dragged out of their cars at checkpoints, and trucks patrolling the streets spraying chemicals in the air. This overt propaganda (that included sock puppet accounts spreading pro-lockdown messaging on social media) was after the covert attempt to suppress information about COVID by imprisoning doctors and falsely claiming that human to human transmission was impossible.
\nI've heard many theories about the motivation behind this disinformation campaign, the most compelling of which being that China knew how serious this virus would be and wanted to allow it to spread to the rest of the world so they didn't suffer alone. My theory is much simpler, and it's based on a pattern we've seen many times in history. The nature of pathological totalitarian governments such as China or the USSR means very little tolerance for mistakes. As we saw in 1986 after the meltdown of the nuclear reactor in Chernobyl, the authorities engaged in a multi-year coverup to deflect blame. Similarly, during the 2003 SARS outbreak, local officials suppressed information about the outbreak and didn't warn the public, fearing reprisal from Beijing.
\nSo, how deadly is SARS-CoV-2 as compared to influenza? According to the Johns Hopkins COVID dashboard, the global death rate (197,872,410 cases; 4,217,383 deaths) is 2.13%. The death rate fluctuates considerably between countries, with the USA at 1.73%, France at 1.81%, and Australia at 2.69%. Strangely, India (where we saw many reports of people dying in the streets when the delta variant started spreading) has a far lower death rate (1.34%) than comparatively richer countries. The USA's 1.73% death rate is some 13.3 times higher than the influenza death rate, so clearly this is a serious illness and we should not be flippant about it, but the question remains: have we overreacted?
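\nAs a sanity check on that arithmetic, a quick sketch in Python using the Johns Hopkins figures quoted above:

```python
# Global case-fatality rate from the Johns Hopkins figures quoted above.
global_cases = 197_872_410
global_deaths = 4_217_383
global_cfr_pct = global_deaths / global_cases * 100
print(round(global_cfr_pct, 2))  # 2.13

# The USA's 1.73% death rate versus the 0.13% influenza death rate
# cited earlier gives the "13.3 times higher" comparison.
usa_cfr_pct = 1.73
influenza_cfr_pct = 0.13
print(round(usa_cfr_pct / influenza_cfr_pct, 1))  # 13.3
```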
\nThe conversation about "15 days to slow the spread" started in mid-March 2020 in Australia and I agreed with the approach at the time. We didn't know how deadly the virus would be and needed to gather data to steer public health policy. For those of you with stone slabs for ceilings, the purpose of the 15 Days Doctrine (also, "flatten the curve") was to stop the initial wave of infections from overwhelming hospitals. Notably, the purpose was not to stop infections entirely. With 1.5 years of hindsight, we now know that (in the United States) 95.2% of deaths are in the 50-up age group and 79.2% of deaths are in the 65-up age group (see figure 1 below). Deaths in the 30-39 age group represent 0.020% of cases, which drops off dramatically for the 18-29 range (at 0.007% of cases), and for school-aged children (0-17 years old) the percentage of overall cases is 0.001%, or 1 in 100,000. The people least affected by this illness have paid the highest price as a result of the hysteria. They have to shoulder the mental burden of isolation during their most important developmental years, potentially being locked up with abusive parents (when school may have been their only respite from physical or verbal abuse), and online learning programs from a school system that is already a complete failure when it comes to academic outcomes. They will also have to shoulder the financial burden of the money printing required to prop up the stock market and pay off employees whose businesses were shut down by lockdown measures.
\nFigure 1. Deaths in the United States by age group as a proportion of deaths and of overall cases\nData: CDC
\nAge group | Number of deaths | % of deaths | % of cases
---|---|---|---
0-17 | 340 | 0.06% | 0.001%
18-29 | 2,493 | 0.41% | 0.007%
30-39 | 7,145 | 1.18% | 0.020%
40-49 | 14,976 | 3.14% | 0.054%
50-64 | 96,318 | 15.96% | 0.275%
65-74 | 134,601 | 22.30% | 0.385%
75-84 | 165,059 | 27.35% | 0.472%
85-up | 178,572 | 29.59% | 0.511%
As I noted earlier, 87% of H1N1 deaths occurred in the under 65 age group which is an almost perfect inverse of the pattern in COVID deaths. Knowing this, we could have designed a program to minimize the impact on the young while protecting the elderly and severely sick, but we didn't. It's like having a fire on your stove, and instead of reaching for a fire extinguisher you get in a plane and dump 12,000 liters of water on your house from overhead. With a common sense policy allowing freedom for those who are not at risk, the cost/benefit analysis is left up to individuals armed with the best information. If you are not in an at-risk group but you don't feel safe then you are perfectly welcome to wear masks, work from home, order all your groceries online, etc.
\nThe fundamental mistake that people make when advocating for universal lockdowns is a lack of consideration for the unseen consequences. This idea is best illustrated by Frédéric Bastiat in his essay "That which is seen, and that which is not seen", where he lays out the Broken Window Fallacy. If someone throws a rock through a shopkeeper's window, people will often attempt to placate the shopkeeper by saying "at least you're giving work to a glazier who will use that to buy goods and enrich the community". What they fail to recognize, however, is that the money used to replace the window could have been used to buy a new shirt. Whereas without the broken window the shopkeeper would have had a window and a new shirt, now he only has a window, and the community is deprived of a shirt it could have afforded otherwise. According to a report by Douglas Allen, the lockdowns in Canada may have saved a cumulative 22,333 years of life as a result of the reduced COVID infections. The other side of the coin is the up to 6,300,000 years of life lost to the unseen costs of the lockdowns themselves.
\nAs Douglas Allen says, "the cost/benefit ratio of lockdowns in Canada, in terms of life-years saved, is between 3.6–282. That is, it is possible that lockdown will go down as one of the greatest peacetime policy failures in Canada’s history".
\nIndeed, in October 2020, more Japanese people committed suicide in that month alone than had died of COVID-19 in Japan over the preceding ten months of the pandemic. A survey by Grossman et al found that 60% of respondents reported increased drinking due to stress and boredom, and in Australia a survey revealed that nearly 10% of women have experienced domestic violence during the pandemic, with 50% saying the abuse had become more frequent or severe since the pandemic began.
\nAs if lockdowns weren't contentious enough, we turn to the topic of vaccines. My typical conversation about COVID vaccines follows this script:
\n"Are you going to get the vaccine?"
\n"No, I don't think so, I'm not at risk"
\n"Oh my God, you anti-vaxxer conspiracy theorist. You're killing grandma!"
\nThe false dichotomy has been set up such that you can only be pro-COVID-vaccine or anti-all-vaccines. I won't virtue signal about vaccine support since that's become the new "I can't be racist, I have black friends", but I will say I'm happy to not be living in an iron lung. I am neither anti-vax, nor anti-COVID-vaccine. I'm pro-COVID-vaccine for those who are at risk and those who voluntarily choose to get it.
\nA typical talking point of the fervently pro-COVID-vaxxers is that they are doing it out of a sense of duty to the collective to reach herd immunity. This notion falsely assumes that everyone being vaccinated necessarily means the pandemic is over. A recent CDC study found that 74% of infections in a Massachusetts outbreak were among the fully vaccinated. Are you surprised? You shouldn't be - breakthrough infections were expected per the clinical data in trials. The number of breakthrough infections is almost certainly being understated since the bulk of these breakthrough infections are asymptomatic, and therefore, rarely cause alarm for individuals to get tested.
\nIn Israel (which has one of the highest vaccination rates in the world) half of the cases in a recent outbreak were among the fully vaccinated, and researchers estimate that the Pfizer vaccine is now just 39% effective in preventing infections which has spurred talks about a third booster shot. This could be due to the spread of new variants (due to the narrow scope of the current generation of COVID vaccines) or due to a reduction in antibody levels among the vaccinated.
\nSo, it has a marginal effect in preventing infections, but it's actually quite good at preventing severe symptoms. Armed with the data, we can conclude that an ideal goal would be "hybrid immunity", where at-risk individuals get the vaccine and the remainder of people are allowed to decide whether to be vaccinated based on their individual risk appetite. This would allow us to begin resuming normal life with no risk of overwhelming hospitals and with few deaths. Consider Figure 3 (below), which charts the death and hospitalization risk against age (with and without comorbidities) amongst the unvaccinated. As a 30-year-old with no comorbid conditions, your risk of hospitalization (i.e. very severe symptoms) is a mere 2.7%. When you include those with asthma, the percentage rises to 5.1%. Including obesity, the number climbs to 12.9%. Indeed, apart from age, obesity is the single biggest risk factor for COVID-19 (which makes the public health policy decision to lock people inside their homes and close gyms all the more absurd). In fact, lung cancer patients have better odds of survival than the obese.
\nFigure 3. Risk of hospitalization and death by age
\n\n\n\nThe singular driving force of all living creatures (and thus, their evolution) is this: "Accruing the most resources with the least effort".
\n
Pfizer made US$7.8bn in Q2 of 2021 alone and is forecasting sales of US$33.5bn for 2021. In Figure 4 below you'll note that for the US market, Pfizer makes $19.50 per dose. If the US government were to mandate vaccines for everyone and placed their orders exclusively with Pfizer, that would mean approximately $13bn in the US market alone. You don't have to be an economics expert to see that the incentives are aligned to get as many people vaccinated as possible, and given the growing calls for booster shots, the gravy train keeps rolling for these pharmaceutical companies. It doesn't matter to the pharmaceutical companies whether people need the vaccine or not; their singular focus is on generating profits based on vaccines sold. This doesn't make Pfizer immoral (perhaps amoral); they're simply responding to the incentives provided by the government.
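\nThe roughly $13bn figure can be reproduced as a back-of-the-envelope calculation. The $19.50-per-dose price is taken from Figure 4; the US population of roughly 330 million and the two-dose schedule are assumptions made for this sketch, not figures from the text:

```python
# Assumptions for this sketch: ~330 million US residents, two doses each.
us_population = 330_000_000   # assumed, not from the source
doses_per_person = 2          # assumed two-dose schedule
price_per_dose = 19.50        # from Figure 4

revenue_bn = us_population * doses_per_person * price_per_dose / 1e9
print(round(revenue_bn, 1))  # 12.9 (approximately $13bn)
```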
\nFigure 4. Cost per shot for COVID vaccines\n
\nAs a self-professed capitalist, I have no issue with companies making money, including pharmaceutical companies. Given the cost of the regulatory burden for developing new drugs, and the fact that close to 90% of all drug candidates are rejected by the FDA, the companies have to recoup costs not just for the development of the drugs that make it to market, but also for the drugs that don't. When profits arise in a free market, they are the result of voluntary interactions between the company and the individual. The individual decides that the product is worth more to them than their money. The nature of mandatory vaccination means that the profits are derived not from voluntary interactions, but by coercion at gunpoint. I've heard several people falsely claim that the vaccine is free, and to those people I say: there ain't no such thing as a free lunch. The vaccine is being paid for in one of two ways: taxation or inflation.
\nIn a free market the decision making would be completely decentralized. Health insurance companies would analyze the risk to them for each individual to get COVID (accounting for lifestyle factors such as obesity, cigarette smoking, etc.) to determine the cost to them with and without the vaccinations. They would also look at the clinical data for each vaccination and determine the risk of long-term side-effects as a result of vaccinations, and if they determine that it would be cheaper/less risky for them if people were vaccinated, they would add a premium loading for those who refuse to get the vaccination. If you are at risk and you refuse the vaccine, then you may have to pay 20-50% higher premiums because the insurance companies know you're more likely to need expensive medical care as a result of infections.
\n\n\nTrust the science Galileo, we all know the universe revolves around the Earth.
\n
A cursory examination of the history of philosophy and science tells a dramatic tale where no good deed goes unpunished, and many of the things which we now take for granted were paid for with the blood of those who refused to conform to consensus. Socrates was sentenced to death for "corrupting the youth" and failing to acknowledge the gods of the city. Similarly, at the time of his death, Galileo was completely blind after 9 years of house arrest for the heretical pronouncement that the Earth revolves around the sun... Ironically, in both of these examples the offending party was persecuted for going against the religious narrative. Today, in our very secular, (allegedly) rational society, a new religion has been born. You run counter to the prevailing pop-sci narrative at your own peril, as the Templars of the "I Fucking Love Science" Facebook page will shout you down and ostracize you from society. In the words of the immortal Anthony Fauci, the great prophet of the cult of Scientism, "attacks on me are attacks on science".
\nFrom a pragmatic standpoint, the percentage of the public who have received COVID vaccinations is entirely arbitrary. The only thing that matters is: are the vulnerable vaccinated? If you are 70 years old, you should probably get vaccinated. If you are a 30-year-old with no comorbidities, you don't need to get vaccinated, and indeed you probably shouldn't, but ultimately it is YOUR choice.
\nFor the people on the political left who used to be skeptical of government overreach, the inherent self-contradiction of advocating for mandatory vaccines is particularly offensive. If you concede that there are things the government can force you to do (at gunpoint) for your health, then you have to concede that there are things that the government can force you NOT to do (at gunpoint) for your health. Like have an abortion, or get gender reassignment surgery, for example. By giving up bodily autonomy you open the door for all forms of medical tyranny.
\nThe statist doctrine of mandatory vaccination can only possibly lead to the fractionation of society into castes, where rights are stripped from those who don't kowtow to the dictates of the state. This obviously violates the 800-year history of Man's natural rights, enshrined in law since the Magna Carta was written. In every medical procedure, the doctor is required to explain the risks, and the procedure only goes forward if the individual (or their kin) gives informed consent. If a treatment will fix your eyes but you'll lose a leg in the process, we all agree that it's vital for the patient to understand the implications of the treatment, and to decide that the consequences of not getting the treatment are greater than the consequences of getting it. You cannot express informed consent for a procedure which you're being forced to undergo, any more than you can call kidnapping "marriage", or rape "lovemaking". You cannot take what can only be given voluntarily.
\nLuke Boyle BAppSc, major in microbiology, since credentials are required to have an opinion now.
\n", "url": "https://boyleingpoint.com/blog/posts/on-mandatory-covid-vaccinations", "title": "On mandatory COVID-19 vaccinations", "summary": "The deadly synthesis of science and politics is nothing new, however, the combination of state enforcement and private profits creates a deadly new concoction unlike any other in modern history.", "date_modified": "2021-08-01T00:00:00.000Z", "author": { "name": "Luke Boyle", "url": "https://boyleingpoint.com" } }, { "id": "https://boyleingpoint.com/blog/posts/mencken-conservatism-book-review", "content_html": "Mencken's Conservatism (2012), was written by Benjamin Marks, editor-in-chief for http://economics.org.au/.\nLike most Australian authors, Marks doesn't get enough attention, so please, support the author.
\nThis primer on Mencken's philosophy was quite profound, and, sadly, very underappreciated. I feel that this is an\nimportant perspective to re-frame the debate around statism versus a free society. The author shows that Mencken is\nnot really a cynic, but a realist. As Mencken said, "Reconciling ourselves to the incurable swinishness of government,\nand to the inevitable stupidity and roguery of its agents, we discover that both stupidity and roguery are bearable -\nnay, that there is in them a certain assurance against something worse."
\nIndeed, his writings didn't bring about a free society - in fact, he correctly predicted that government would continue\nto grow at an exponential rate after his death. Advocating for the abolition of the state (or even the greater utopian\nvision of a limited state) is like trying to steer a cruise ship with an oar. So, how did Mencken work for a lifetime\nand still carry on with relative happiness? He didn't write to persuade. The author notes, "Writing to persuade can\nleave you with many peculiar stances. But writing to express your libertarian beliefs is a much more straightforward\nenterprise, and your writing is then relevant forever and won't come back to haunt you".
\nThis makes me think of the modern Conservative, whose current platform generally resembles\nthe progressive platform of yesterday. It's an eternal game of rugby where the progressives charge ahead, and the\nconservatives celebrate a successful tackle without noticing they've ceded ground.\nWhen the progressives say, "we want $3 trillion in equitable infrastructure spending", and your response is\n"Let's compromise. How about $1.5 trillion?", you have already lost the debate. You tacitly\nadmit that some government spending is good, and if some government spending is good, then apparently you\ncan't have too much of a good thing, so why stop at $1.5tn? You are attempting to persuade the progressive that\ngovernment spending is evil while agreeing to government spending. Instead, you should argue from the\nprinciple that all government spending is necessarily funded by theft at gunpoint, and therefore any concession is\nunconscionable.
\nThe book has shown me that I have been far too utopian in discussions about free societies. Rather than listing all\nthe ways that a free society will be better for the individuals within it - given that this is entirely subjective\n(and many people find a great deal of comfort in being subordinate to the coercive monopoly of the state) - it is far\nmore productive to argue from first principles. You may not be liked, but you will be authentic, and that is far more\nimportant in the long term. No amount of concession from you will make a free society any more likely. You'll either be\nhated for adhering to your principles, or you'll be forgotten because you abandoned them.
\n\n\n", "url": "https://boyleingpoint.com/blog/posts/mencken-conservatism-book-review", "title": "Book review: Mencken's Conservatism by Benjamin Marks", "date_modified": "2021-07-21T00:00:00.000Z", "author": { "name": "Luke Boyle", "url": "https://boyleingpoint.com" } }, { "id": "https://boyleingpoint.com/blog/posts/how-an-economy-grows-why-it-crashes-review", "content_html": ""The fraud of democracy, I contend, is more amusing than any other... All its axioms resolve themselves into thundering paradoxes, many amounting to downright contradictions in terms. The mob is competent to rule the rest of us - but it must be rigorously policed itself. There is a government, not of men, but of laws - but men are set upon benches to decide finally what the law is and may be [...] I confess, for my part, that it greatly delights me. I enjoy democracy immensely. It is incomparably idiotic, and hence incomparably amusing."
\nH. L. Mencken
\n
How an Economy Grows and Why it Crashes (henceforth known as "the book") is written by Peter and Andrew Schiff. The\nbook is an allegory for the economy based on a fantasy island where every man needs one fish a day to be satisfied.\nThe book outlines, in a very approachable and easy-to-understand manner, the ways the inhabitants of the island increase\ntheir productive capacity through capital investment and under-consumption. Peter and Andrew Schiff borrowed the central\nallegory from How an Economy Grows and Why it Doesn't, written by their father Irwin Schiff (the revered tax\nprotestor whose The Federal Mafia is one of only two books to be banned in America). Having not\nread the original, I have to assume that the primary differentiating factor between the books is that the adaptation\nincludes an explanation of the cause and aftermath of the 2008 housing crisis.
\nThe book is very entertaining, and it's a very easy read (it took me probably 5 hours, and I'm a particularly\nslow reader). I went through with a highlighter and emphasised the key points, but I found that as I got to the middle\nof the book, the insights started to dry up. By Chapter 8 (A republic is born), it started to drag. I suspect -\nthough I may be wrong - that this is approximately where the original allegory of Able, Baker, and Charlie growing the\neconomy ended, and where the younger Schiffs' original portion began. It was still entertaining, but unlike the start, which\nwas packed with easy-to-understand explanations of economic principles, the middle part leading up to the housing crisis\nwas mostly a rushed re-telling of history with a healthy helping of fish puns.
\nWhen the authors got past the historical portion and into the future, the book read better, and it ended very strongly. As this book\ncame out in 2010, the authors envisioned a future where America had to face the music, and Obama took responsibility\nfor his economic policy blunders. Unfortunately, with hindsight, we know that sort of happy ending is rare in politics.\nObama and the Fed's policies became worse, and Trump then inherited and continued them. Ten years after publication, the\ncrash described in the book hasn't arrived as the authors expected, but given the state of the American economy it seems\nmore likely by the year.
\nHere are some key takeaways from the book:
\n(After increasing the productive capacity of the island; that is, catching more fish) "This didn't happen because the\nthree guys were unsatisfied with their limited lifestyle. Their hunger, which is labeled "demand" in economic terms, was\nnecessary to spur economic growth but not sufficient to achieve it."
\n"With their extra fish, the islanders can finally eat more than one fish per day. But the economy didn't grow because\nthey consumed more. They consumed more because the economy grew."
\n(About Able giving a loan to someone to take a holiday) "Not only would such a transaction put his savings at unnecessary\nrisk, but it would mean that the capital would be unavailable for more productive loans."
\n"In actuality, loans to consumers that do not fundamentally improve productive capacity are a burden to both the lenders and the borrowers."
\n"Steadily dropping prices also encourage savings as islanders begin to understand that their fish would likely\nbuy more goods in the future than they do in the present."
\nKeynesians react to falling prices like a vampire reacts to a crucifix. Such a reaction is\nunderstandable when you realise that their theories are predicated on the idea that spending (i.e. consumption)\nequates to economic growth. This is why their primary course of action when faced with an economic contraction is\nmonetary stimulus. Inflation is the best way to ensure people spend what they make, because if people know prices are\ngoing to rise, they are more likely to spend their money on goods they'll need in the future.
\nI'd suggest this book for people with a cursory interest in economics but without much of a background. It's quite\neasy to grasp and would be good for young high school students.
\nI give it a 6/10.
\n", "url": "https://boyleingpoint.com/blog/posts/how-an-economy-grows-why-it-crashes-review", "title": "Book review: How an Economy Grows and Why it Crashes (2010)", "date_modified": "2021-06-08T00:00:00.000Z", "author": { "name": "Luke Boyle", "url": "https://boyleingpoint.com" } }, { "id": "https://boyleingpoint.com/blog/posts/act-government-subsidising-energy-price-increases-they-caused", "content_html": "Canstar recently reported that\npower bills in the ACT are set to increase by more than $200 annually.\nExpectedly, this is being labelled a "price hike" in order to lay the blame\nat the feet of the energy provider (EvoEnergy is the primary provider of ACT's\npower). As the ABC\nnotes,\nthe price increase is quite unexpected due to the relatively low wholesale energy\nprices. The 2020/2021 spot prices for wholesale energy are at about 2016 levels\ncurrently (see below).
\n\n\nSo, why are these low wholesale prices not being reflected in the ACT market? It\nis due to market interference by the ACT government. Whereas other suppliers\nwould change the retail price per kWh to track the wholesale spot market, the\nlegislators are forcing EvoEnergy to abide by long-term fixed-price contracts\nnegotiated by the government. You can almost guarantee this type of deal would\nnot happen in the free market, as most successful companies understand risk\nforecasting very well. In times of high energy costs (such as in early 2019),\nthis may have been beneficial; however, now that we're at relative lows,\nEvoEnergy must pay the excess to cover the shortfall. EvoEnergy expects payments\nunder these fixed-price contracts to more than triple from\n$42 million this financial year\nto $127 million in FY2021-22. The primary reason\nfor the government interference in the energy sector is the ACT's push for 100%\nrenewable energy (which they purportedly achieved in\nlate 2020).\nAs is typical, this claim of 100% renewable energy is sleight of hand: the\nelectricity supply that powers the ACT is still non-renewable; they simply\noffset their energy usage using solar and wind farms. It's fortunate they don't\ndepend on these particular renewable sources of power as their primary grid.
\nCalifornia proved during their 2020 summer season that solar and wind\ndon't scale well to high demand.\nTheir rush to shut down their nuclear and coal power plants resulted in a\nvulnerable grid. Solar and wind in particular are vulnerable because they need\nideal conditions to produce power effectively. As the sun went down on\nthose hot summer nights, Californians turned on their air conditioners only to\nfind that their solar network didn't work particularly well without a shining\nsun! To compensate for the shortfall in production, they had to increase the\noutput of their coal power plants. A resilient power network surely can't be one\nthat falls apart on the first overcast, still day. China understands this\nproblem well. They are planning to build six to eight new nuclear reactors per\nyear between 2020 and 2025 (which would make them the largest producer of nuclear energy by\n2022, surpassing America and France).
\nBack to the ACT, where the renewables are a secondary part of\nthe grid, offsetting the emissions of the primary (non-renewable) part of the grid. A fully\nrenewable, reliable grid is not achievable without nuclear (with today's technology, that is), so the ACT has to\ndouble-dip to give the appearance of a fully renewable grid. What this means for\nconsumers is that they have to pay for the non-renewable AND the renewable\nportion of the electricity.
\nSo, what is the government doing to remediate the issue they've caused? Will\nthey allow the company to negotiate their pricing contracts themselves? Or\nperhaps walk back their 100% renewables requirement? Or perhaps they'll even\ninvest in nuclear? If you answered "none of the above", you'd be correct.\nInstead, the government will increase electricity concessions by\n$50 (up to $750) annually this financial year, and a further $50 (up to $800)\nnext financial year. The concession represents a\n$24.8 million \ndollar annual expense for the government, in addition to the one-off $1\nmillion investment going to the "utilities hardship fund". That's like pushing someone\noverboard and then applauding yourself when you throw them a life jacket. You\nstill pushed the poor bastard in the water but at least the TV crew was there to\nwatch you rescue him.
\nIt's the typical cycle of well-intentioned legislation: the government interferes in a market, prices rise as a\nresult, and the government then spends more money to subsidise the damage it caused.
\nThe end result is always the same. The biggest losers are the low-income\nconsumers who now have to spend more of their income for the same service. The\ngovernment spends more (likely via deficit spending), and it likely\nmakes less from taxation. You may imagine that the company is doing well; after\nall, they've raised their prices, therefore they must be making more money. That\nis not the case. The best way to make money as a business is to make your goods\nand services more affordable, and therefore allow more people to buy. As a\nresult of these price increases, people will change their behaviour to use less\npower. They'll put on an extra pair of socks, and drink their tea a little hotter this winter.\nPerhaps they'll install a wood stove to heat their home without using\nelectricity. The government doesn't even benefit: apart from their PR victory in\nthis instance, everyone is worse off when they have to pay for the moral hazard\nof well-intentioned legislation.
\n", "url": "https://boyleingpoint.com/blog/posts/act-government-subsidising-energy-price-increases-they-caused", "title": "ACT government subsidising energy price increases that they legislated into existence", "date_modified": "2021-06-08T00:00:00.000Z", "author": { "name": "Luke Boyle", "url": "https://boyleingpoint.com" } }, { "id": "https://boyleingpoint.com/blog/posts/protecting-yourself-from-inflation", "content_html": "Your currency is worthless.
\nWhen priced in gold, you can see that over the last hundred years the US Dollar\nhas lost all of its value. The graph below plots the price of gold per ounce in USD from 1915 to the present.\nIn 1915 the price of an ounce of gold was $19.25. When the dollar was created in 1792, the cost of gold was $18.60 per\nounce, which we will refer to as the baseline value of the dollar. Between 1792 and 1915 (123 years) the price of\ngold only increased by 65 cents (roughly half a cent per year, or an average of 0.028% p.a. in inflation). During\nthis period, wages were relatively flat; however, America also became heavily industrialised, and the cost of living\nhalved. So, not only did the value of your money remain flat, you were able to buy more goods.
\n\nYou'll note that the price of gold was flat until the early 1930s, when the government decided that it needed to devalue the\ncurrency so it could create more dollars (since they were on a gold standard, they could only have as many dollars as\nthey had gold reserves), so the legislature re-defined the value of an ounce of gold to be $35. Back then, it was a\nsimple change of definition, since the dollar was still tied to, and redeemable in, gold and silver. You'll notice that\nall hell breaks loose in 1971 when America left the gold standard (duping the entire world into accepting fake money\ntied to no real-world value for their exports), and people were no longer allowed to redeem their dollars for gold\n(including foreign investors). Between 1971 and today, the price of gold rose from $36.56 to $1715.24 (as of March\n2021). That is a face-melting 4591.56% increase in 50 years - roughly 8% per year compounded (the simple average of\n91.8% per year overstates the annual rate).
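\nAs a quick sanity check (my own arithmetic, using the prices quoted above), the growth figures can be reproduced in a few lines of Python; note the difference between the naive simple average and the compound annual rate:

```python
# Gold prices (USD per ounce) as quoted in the text above.
p_1792, p_1915 = 18.60, 19.25    # gold-standard era
p_1971, p_2021 = 36.56, 1715.24  # after leaving the gold standard

def cagr(start, end, years):
    """Compound annual growth rate between two prices."""
    return (end / start) ** (1 / years) - 1

# 1792-1915: money was essentially flat.
print(f"{cagr(p_1792, p_1915, 123):.3%}")  # ~0.028% per year

# 1971-2021: an enormous total rise, but the compounded rate is ~8% per
# year, not the 91.8% you get by naively dividing the total by 50 years.
print(f"{(p_2021 / p_1971 - 1):.2%} total")
print(f"{cagr(p_1971, p_2021, 50):.1%} per year compounded")
```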
\nThe graph begins shortly after 1913, when the Federal Reserve Act was passed (creating the Federal Reserve) and the\nSixteenth Amendment was ratified (allowing the government to tax income). The Federal Reserve was intended to be an\napolitical, non-government organisation, allowing the creation of money to be separate from the legislature. That seems ridiculous in hindsight, as the\nFed and the legislature are just two sides of the same spending-addicted coin, hell-bent on debasing the currency no matter\nthe cost. This isn't anything new; the Fed has been monetising the government's debt since its inception, but I think the\nwheels came off when the legislature\nmade the decision to leave the gold standard to fund their war machine abroad. The few checks and balances that previously\nexisted evaporated. Previously, the legislature had to vote to devalue the dollar before the Fed could print more money; but\nafter leaving the gold standard, the Fed now digitally prints tens of billions of dollars per month to buy treasury\nbonds to fuel the stock market bubble. Similarly, the income tax was another ill-fated policy. Originally, the income tax rate\nwas 1% for people earning $0 to $20,000 (which is ~$529,911 today, according to the CPI, or $1,782,067 when priced in gold)\nwith a top nominal tax rate of 7%. Clearly, in hindsight, these low rates were the camel's nose under the tent, and this\nwas the government laying the groundwork for massive tax hikes during World War I (yet another war America didn't need to be involved in).\nBy 1918, the top nominal tax rate was a whopping 77%.
\nAll of that background is just to highlight the state of decay America is in, which continues to accelerate as government\ngrows. The reason I like to view inflation through the lens of the gold price is that gold has been used as money for\nthousands of years, which is a much more meaningful time-scale than the ~230 years the USD has been in existence.\nInterestingly, when you price the Dow Jones index in dollars, it's at a record high of $32,981.55 (compared to\n$1457.37 in 1915), an increase of 2163%; however, when you price it in ounces of gold, today it's at 19.23oz\n(compared to 2.86oz in 1915), an increase of 572.4%. A far cry from the >2,000% when measured in dollars.
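\nRe-pricing an index in gold is just a unit conversion - divide the dollar level by the dollar price of an ounce. Using the figures quoted above:

```python
# Figures as quoted in the text: Dow level in USD, gold in USD per ounce.
dow_2021, gold_2021 = 32981.55, 1715.24
dow_1915_oz = 2.86  # the 1915 Dow expressed in ounces of gold

def index_in_gold(index_usd, gold_usd_per_oz):
    """Express an index level in ounces of gold instead of dollars."""
    return index_usd / gold_usd_per_oz

dow_2021_oz = index_in_gold(dow_2021, gold_2021)      # ~19.23 oz
rise_in_gold = (dow_2021_oz / dow_1915_oz - 1) * 100  # ~572% in gold terms
print(f"{dow_2021_oz:.2f} oz, up {rise_in_gold:.1f}%")
```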
\nNow that you're caught up on the historical horrors of the US dollar, we can talk about the present horrors. 40% of all\ndollars in existence were printed in 2020, and nearly half a trillion dollars has already been added to\nthe national debt in 2021. The CPI is the measure of inflation we typically use (and it is actively manipulated to understate\nthe true increase in goods prices),\nand for the twelve months to March 2021 it measured a 2.6% increase. It is no longer a conversation of "massive inflation is coming". Massive\ninflation is HERE! Look at the price of lumber, for example (see below), which has had a 47% increase so far this year, after a 125% increase\nlast year.
\n\nIt should be clear from the preceding rant that I believe gold is the best way to hedge yourself against\ninflation, especially since, unlike wheat and oil, it never decays. Remember, it's not that gold is getting more expensive, it's\nthat the dollar is getting weaker. So, trading precious metals for\ncommodities to hedge yourself against currency fluctuations is the best course of action. What should you do if\nyou don't have the available capital to buy gold? Firstly, you can buy much larger quantities of silver for far lower\nprices than you can buy gold, and it's more viable for everyday exchanges because it's more divisible.
\nAssuming you can't buy silver either, you should not let your money wither away at 0% interest in the bank or risk losing\nit on overpriced stocks (I'm not advising you pull your retirement funds out of the market, but you could consider\nmixing in some inflation hedges like gold and gold mining stocks). The best thing you can do locally is stock up on\ngoods that you know you will need down the road. In 2017 Mark Cuban (Newly converted Bitcoin bull) said that people\nstruggling to get ahead should buy in bulk and on sale. In hindsight, this was fairly prophetic considering where we are today.\nMany goods on the shelves are experiencing unprecedented price surges, and that doesn't even address the real possibility\nof serious goods shortages in the near future. Start working out how much of each non-perishable\ngood you use per month and extrapolate that for a year. Here are some ideas for you:
\nIf you use 4 rolls a month, that works out to 48 per year. Buy four 24 packs and you are set for two years without\nbuying toilet paper. Make sure you store it in a cool, dry place.
\nThis is rather self-explanatory, but be aware that powdered detergent only has a shelf life of about 6 months, so\nconsider using liquid detergent instead.
\nColgate recommends a maximum of two years' shelf life for toothpaste, so don't buy too much.
\nYou can really go hell for leather with this, just consider if you have the right conditions in your house\nto store them long term.
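\nThe "monthly usage times twelve" arithmetic above is trivial to script if you want to budget a stockpile (the items and usage rates below are made-up examples, not recommendations):

```python
# Hypothetical monthly usage of non-perishables (units per month).
monthly_usage = {"toilet rolls": 4, "toothpaste tubes": 0.5, "detergent loads": 20}

def yearly_needs(monthly, years=1):
    """Extrapolate monthly usage out to a yearly (or multi-year) shopping list."""
    return {item: qty * 12 * years for item, qty in monthly.items()}

two_year_plan = yearly_needs(monthly_usage, years=2)
print(two_year_plan["toilet rolls"])  # 96 rolls, i.e. four 24-packs, as in the text
```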
\nIf you are curious about getting started buying precious metals, check out Schiff Gold for\nAmericans, and Perth Mint or ABC Bullion for\nAustralians. If you are worried about inflation, you should steer clear of gold/silver ETFs, as you don't have\nthe security of physical metal, and it can easily be seized by the government. With a gold broker like the above, you\ncan store your metal at their secure facilities and request redemption at any time. I don't keep any physical metals at home as I don't\nhave anywhere to securely store them, but a bank safe deposit box would be a good alternative.
\nGood luck out there everyone.
\n", "url": "https://boyleingpoint.com/blog/posts/protecting-yourself-from-inflation", "title": "Protecting yourself from inflation", "date_modified": "2021-04-24T00:00:00.000Z", "author": { "name": "Luke Boyle", "url": "https://boyleingpoint.com" } }, { "id": "https://boyleingpoint.com/blog/posts/supreme-court-big-tech-censorship", "content_html": "In 2018, then President Donald Trump was sued for blocking users on Twitter. The plaintiffs argued that Trump was\nviolating their first amendment rights by blocking them as they were not able to comment on his posts. The court ruled\nin the plaintiffs' favour and Trump was ordered to unblock the users. The 2nd\ncircuit court upheld that ruling, setting the precedent that social\nmedia is the new public square and elected officials must allow all users to comment on their posts (Rep. Lauren\nBoebert is currently facing a\nlawsuit with a similar premise).
\nWhat I find striking about this decision is that the court ruled that Trump must allow users to\nengage with his posts on his personal Twitter account, in spite of the fact that they could\nstill engage with him on his official government account (@POTUS). Given that the court decided that public officials\nmust allow all users to engage with them, it would stand to reason that public officials shouldn't be excluded\nfrom using the new public square. That is why I find it terrifying that Twitter was able to de-platform Trump before\nhis presidential term was over, and that Laura Loomer was denied access to Twitter in spite of her being selected as the\nRepublican nominee for her congressional race. If Twitter truly is the public square, then why are they allowed to\npermanently ban American citizens and elected officials for expressing opinions Twitter disagrees with? Are they not\nviolating those banned users' first amendment rights to engage in the public square?
\nTrump's appeal of the 2nd circuit decision I mentioned above recently reached the supreme court. This could have\nbeen an important moment, as it's one of the first Big Tech censorship cases that would have been heard by the supreme\ncourt, but alas, the case was dismissed. Clarence Thomas concurred with the dismissal but left some golden\nnuggets of wisdom in his statement.
\nThomas said "this petition highlights the principal legal difficulty that surrounds digital platforms — namely, that\napplying old doctrines to new digital platforms is rarely straightforward. ... some aspects of Mr. Trump's account\nresemble a constitutionally protected public forum. But it seems rather odd to say that something is a government forum\nwhen a private company has unrestricted authority to do away with it". Indeed, how can you say that Twitter is a\nconstitutionally protected forum when a low-level employee - whose labour is potentially outsourced to a contractor in\na foreign nation (as is the case with Youtube and Facebook) - has the ability to temporarily or\npermanently suspend the account of American elected officials? Rep. Marjorie Taylor Greene was recently\nsuspended from Twitter for an\nEaster Sunday post saying "He is Risen". Twitter later claimed that this was done "in error". How is a rogue employee\nallowed to exercise that much authority over the speech of an elected official?
\nThis is only scratching the surface, as there is also the matter of big tech firms acting in concert to ban people\nfrom all platforms simultaneously, presumably so the banned users are not able to redouble their support on other\nplatforms and prepare to mitigate the loss of audience. In 2018, Alex Jones was banned by Facebook, Apple, YouTube, and\nSpotify on the same\nmorning. Surprisingly, the only major platform not to ban him at the time was Twitter. These firms are the\nmodern-day Robber Barons, but instead of colluding to fix prices, they collude to exclude people from access to\nmass communication. YouTube actually has a 3-strike system to give users some warning when they're close to being\nremoved from the platform, but they have no problem circumventing this moderation program to remove people in order to make\npolitical statements. YouTuber Mumkey Jones was given 3 strikes in a matter of minutes on content that had previously\nbeen moderated and found to be suitable. Stefan Molyneux didn't even get the 3 strikes; he was simply\nremoved. Keep in mind, these bans are permanent. In our society, even some murderers are given a path to redemption.
\n"Today's digital platforms", Thomas continues, "provide avenues for historically unprecedented amounts of speech,\nincluding speech by government actors. Also unprecedented, however, is the concentrated control of so much speech\nin the hands of a few private parties. We will soon have no choice but to address how our legal doctrines apply to\nhighly concentrated, privately owned information infrastructure such as digital platforms." He said "The Second\nCircuit court's decision that Mr. Trump's Twitter account was a public forum is in tension with,\namong other things, our frequent description of public forums as 'government-controlled spaces'".
\nThe real meat and potatoes of the statement is where Thomas says "If part of the problem is private, concentrated\ncontrol over online content and platforms available to the public, then part of the solution may be found in doctrines\nthat limit the right of a private company to exclude". Our legal system, according to Thomas, has "long subjected\ncertain businesses, known as common carriers, to special regulations, including a general requirement to serve all\ncomers". An example of such a common carrier would be a telephone company. The telephone infrastructure was built by\nprivate companies, but existing laws dictate that those companies may not pick and choose customers based\non their ideological bent.
\nIf these individual companies are treated as carriers for a particular service (i.e. Twitter\nis the common carrier for Tweets, Facebook is the common carrier for boomer memes), then they will probably just have to\nstop banning users who post lawful content in order to comply. But if the service is "social media", and all social media\ncompanies are considered common carriers for "social media", I believe there are far more dire consequences for them than just\nhaving to platform ideas and people they disagree with. What happens when I sign up for AT&T and I try to call you\non Verizon? The call connects seamlessly in spite of us being platformed by different companies, anywhere in the country, at any time.
\nWhat I think is next for social media companies as common carriers is that if I make a post on Facebook, you must be\nable to access and engage with it on Twitter. This essentially means there would be some standard format and protocol for\nsending and storing social media content, and all existing and future social media companies\n(Thomas also says that "no substantial market power is needed so long as the company holds itself out as open to the public")\nwould have to adhere to this standard, ensuring the complete suffocation of innovation in this space.\nAs is the case with telecommunications companies and banks, this may also mean that Twitter and Facebook would have to\nabide by KYC (Know Your Customer) legislation, and users' real-world identities would be tied to their accounts and\nreported to regulatory bodies. It may also mean the companies must comply with counter-terrorism legislation and\nreport users engaging in potentially illegal activities (or face large fines for not reporting them).
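\nTo make the idea concrete, here is a purely hypothetical sketch of what such a mandated interchange record for a post might look like. No such standard exists; every field name here is invented for illustration:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class InterchangePost:
    """Hypothetical standardised post record that any 'common carrier'
    social platform would have to accept and serve. Entirely invented."""
    post_id: str
    author_kyc_id: str    # real-world identity reference, per the KYC scenario
    origin_platform: str  # e.g. "facebook"
    body: str
    reply_to: Optional[str] = None
    attachments: List[str] = field(default_factory=list)

# A post authored on one platform that any other platform must be able to serve.
post = InterchangePost(post_id="123", author_kyc_id="au-000-111",
                       origin_platform="facebook", body="boomer meme goes here")
print(post.origin_platform)
```

The rigidity of a schema like this is exactly the point being made above: once every carrier must speak it, the format can no longer evolve freely.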
\nEssentially, only extremely large companies would have the resources to abide by the legislation,\nand the government will have destroyed the growth of yet another industry. I think that outcome is a long shot,\nespecially considering Facebook is now the largest corporate lobbyist,\noutspending even the telecom and defence companies (and that doesn't include the nearly $500 million Mark Zuckerberg\n"donated" to fund the 2020 presidential election), but\nI believe it's one of the most important issues we presently face.
\nI look forward to seeing future supreme court cases on this topic, particularly the statements of Clarence Thomas,\nwho has consistently been the voice of reason on the bench. Clarence Thomas' senate confirmation hearings were oddly\nreminiscent of the Brett Kavanaugh hearings in 2018, as a report of alleged sexual misconduct was leaked to the press\nand his reputation was dragged through the mud. Thomas proceeded to dunk on the committee of dickheads (led by then\nSenator Joe Biden) with one of my all-time favourite quotes:
\n\n\n", "url": "https://boyleingpoint.com/blog/posts/supreme-court-big-tech-censorship", "title": "Justice Clarence Thomas on Big Tech Censorship", "date_modified": "2021-04-06T00:00:00.000Z", "author": { "name": "Luke Boyle", "url": "https://boyleingpoint.com" } }, { "id": "https://boyleingpoint.com/blog/posts/wisconsin-election-fraud", "content_html": "This is not an opportunity to talk about difficult matters privately or in a closed environment. This is a circus.\nIt's a national disgrace. And from my standpoint, as a black American, as far as I'm concerned it is a high-tech\nlynching for uppity blacks who in any way deign to think for themselves, to do for themselves, to have different\nideas, and it is a message that unless you kowtow to an old order, this is what will happen to you. You will be\nlynched, destroyed, caricatured by a committee of the U.S. Senate rather than hung from a tree.
\n
According to the vote pattern analysis performed by\nthe Voter Integrity Project (VIP), out of the 8,954 vote count updates, there were 4 decisive updates that essentially\nsecured the election for Biden. According to VIP these were the "outliers of the outliers".
\nThe analysis alone is not conclusive evidence of fraud, but we have begun to get a clearer picture of the events that\ntook place during the early hours of November 4th - the day after election day - when Biden took significant leads in\nmultiple states. When you combine the statistical anomalies with what I'm about to show you, it tells us without\na doubt that there can be zero confidence in the accuracy of the results in these two states.
\nThe now infamous video of the election counters "pulling suitcases out from under tables" takes centre stage in this\nstate. This footage broke and was summarily "debunked" by leadstories,\nhowever, their fact check leaves a lot to be desired.
\nOn the suitcase claim they said "The officials said the ballots seen in the video were in regular ballot containers -- not suitcases".\nYes, they are in fact regular ballot containers, and in typical deboonker fashion, they opt to play linguistic games\nfor the purpose of slapping politically motivated fact checks on Facebook posts. They claim this was not in fact illegal\nballot counting as the witnesses left of their own volition; however, two of the witnesses signed affidavits saying\nthey were asked to leave because counting had concluded for the night. Also, it's worth noting that the attribution\nof this fact check has been switched from the "journalist" who wrote it originally to the big man Alan Duke. Alan,\nwas it worth it? Do you feel good about yourself now that you've given up your credibility to sell fact checks to Facebook?
\nRegardless of the dispute over whether this ballot counting was legal, we know for certain that they DID count ballots, as\nwe have the footage. According to The Epoch Times\nthey stayed back and counted from 11pm to 1am. Imagine my shock when I found out that this corresponds pretty closely with\nthe 1:34am update that the VIP identified as the third most anomalous update in the set. That update had 136,155\nvotes for Biden, approximately 11x the current margin of victory.
\nIn the early hours of Nov 4th, 2020 (why is it always the early hours when all these oddities occur?), we learned\nthat Claire Woodall-Vogg, the election official in charge of collecting the USB drives from each tabulator,\nmanaged to forget a USB and leave it\nat the voting precinct. These USBs contain all the votes for the election, unencrypted and in .csv format...
\nIn her statement about the incident\nWoodall-Vogg said:
\n\n\nOn November 4th, around 3:00am, the City of Milwaukee finished counting absentee ballots,\nand I began to export the results from Tabulator 7. Tabulator 7 was the last to finish\nprocessing ballots and was the only remaining flash drive to be burned. As I burned the flash drive,\nwhich can take up to 10 minutes, Milwaukee County Election Commission Director Henry asked that I\nbring a report for each tabulator regarding the number of ballots processed per precinct.
\n
Woodall-Vogg then processed the reports as requested and delivered the USB drives to the Election Commission,\nonly to discover that she had left the USB drive in Tabulator 7! I'm shocked! She continued, "I immediately called\nKimberly Zapata, a member of my senior leadership team, who was still present at Central Count and confirmed it was\nstill in the machine". She just admitted that she left the USB drive unattended in a building where poll workers were\nstill working. It is immediately apparent that the chain of custody was broken. There is a reason all paper ballots\nmust be sealed after election day in bags with special security tags that break if removed. There is no assurance that the USB\nwas not tampered with, and there can be no such assurance without an independent forensic audit of the USB device to\ndetermine whether the votes were manipulated (Woodall-Vogg claims the export timestamp is the same, but excuse me for not\nbelieving the classic "we investigated ourselves and found no evidence").
\nThe letter continues, "Ms. Zapata gave the flash drive to a Milwaukee Police Department Officer who delivered the flash\ndrive approximately 10 minutes later". Her statement begins at 3:00am and she mentions 10 minutes twice. Accounting for\ntravel time, that would place the time she recovered the USB at roughly 3:30am. Assuming she went\ndirectly to upload the votes, wouldn't you know, that corresponds almost exactly with the anomalous update identified\nby the VIP at 3:42am where Biden took the lead. See the graph below, and you'll be able to see the update immediately.\nHint: It's the part where Biden overcame a 100k Trump lead in a single update.
\n\nSo, people of Milwaukee, your election officials think you are stupid. They are expecting that you won't\nbe able to draw the connection between this extremely suspicious vote count update, and a KEY election official\nlosing a USB containing votes. This person single-handedly destroyed the integrity of your election. At best, she\nmade an honest mistake and someone took advantage of it to manipulate the votes, and at worst, she was involved\nand assisted by crafting this story and lying to the public.
\nTo make matters worse, Joe Biden won a record low of 14 counties in Wisconsin while still winning the state. The\nnext-lowest county total for a statewide winner is Obama's 35 in 2012. Demand a good faith audit now, or\nthe next four years will be a dark cloud over the executive branch.
\n", "url": "https://boyleingpoint.com/blog/posts/wisconsin-election-fraud", "title": "Wisconsin and Georgia: Impropriety or fraud?", "date_modified": "2020-12-06T00:00:00.000Z", "author": { "name": "Luke Boyle", "url": "https://boyleingpoint.com" } }, { "id": "https://boyleingpoint.com/blog/posts/sitemaps-for-next-js-static-sites", "content_html": "I just recently re-built my Gatsby site using Next.js. I liked Gatsby for a while,\nhowever, I had a few issues:
\nthe build process has always been dodgy for me,
\nthe watch task (i.e. gatsby develop) failed after being up for a while
builds didn't work on the Windows Subsystem for Linux (WSL)
\noverburdened with configuration modules
\n\nThe Lighthouse audit results after my first round of changes
\nThe biggest selling point for me is the getStaticPaths
function in the Next.js pages.\nBefore, as a pre-build step, I was generating the entire page tree of React components using a node script. Super heavy-handed, and I'm sure\nthere are better ways to do it in Gatsby. What I'm doing now looks like this:
.\n└── pages\n └── blog-posts\n └── [year]\n └── [month]\n └── [title].tsx
\nThe resulting routes map directly to the URL you see in your browser's address bar. Blog post routes look like: /blog-posts/2020/08/some-name
[title.tsx]
export default function Post() {}\n\nexport async function getStaticPaths() {\n const blogPosts = await getBlogPosts();\n\n const paths = blogPosts.map(\n post => `/blog-posts/${post.year}/${post.month}/${post.title}`\n );\n\n return { paths, fallback: false };\n}
\nIn the getStaticPaths
function you return a list of new paths and Next.js automatically spits those pages out. At\nbuild time, you can then use the path parameters to fetch external data and build your components. What this means, in\neffect, is that your /pages
folder no longer maps 1:1 to the static output. So you can't just build a sitemap off\nthe page directory anymore.
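Since getStaticPaths only enumerates the routes, each dynamic page also needs a getStaticProps that turns those path parameters into page data at build time. Here's a minimal sketch of that shape; `getBlogPost` is a hypothetical helper standing in for whatever data source you use, not something Next.js provides:

```javascript
// Minimal sketch of the companion getStaticProps for [title].tsx.
// `getBlogPost` is a hypothetical helper standing in for a real
// data source (filesystem, CMS, etc.); it is not part of Next.js.
async function getBlogPost({ year, month, title }) {
  return { year, month, title, body: '...post content...' };
}

async function getStaticProps({ params }) {
  // params holds the [year]/[month]/[title] segments for this route
  const post = await getBlogPost(params);
  return { props: { post } };
}
```

In a real page you'd `export` getStaticProps alongside getStaticPaths and render `props.post` in the page component.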
There's a comprehensive article on the topic by Lee Robinson (https://leerob.io/blog/nextjs-sitemap-robots),\nbut his guide also assumes your source pages map 1:1 to the expected output. I adapted his script to build from the folder output instead.
\nyarn add -D glob [chalk] [prettier]
// Note: ESM imports require "type": "module" in package.json (or a .mjs extension)\nimport glob from 'glob';\nimport fs from 'fs';\nimport chalk from 'chalk';\nimport prettier from 'prettier';\nimport prettierConfig from './.prettierrc.js';\n\n(() => {\n // default next js output is `out`\n // all the pages are guaranteed to be html\n glob('./out/**/*.html', (err, files) => {\n // If glob errored or there are no files, a build probably hasn't been run\n if (err || !files.length) {\n console.error(chalk.red('Could not find output directory'));\n process.exit(1);\n }\n\n const sitemap = `\n <?xml version="1.0" encoding="UTF-8"?>\n <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n ${files\n .map(page => {\n const path = page.replace('./out', '').replace('.html', '');\n const route = path === '/index' ? '/' : path;\n\n return `\n <url>\n <loc>${`https://{Your Domain Here}${route}/`}</loc>\n <changefreq>daily</changefreq>\n <priority>0.7</priority>\n </url>\n `;\n })\n .join('\\n')}\n </urlset>\n `;\n\n // Optional: you can remove this block if you aren't using prettier\n const formatted = prettier.format(sitemap, {\n ...prettierConfig,\n parser: 'html'\n });\n\n fs.writeFileSync('./out/sitemap.xml', formatted);\n });\n})();
\npackage.json
{\n "type": "module",\n "scripts": {\n "start": "next start",\n "build": "next build && next export && yarn run build:sitemap",\n "build:sitemap": "node ./generate-sitemap.js"\n },\n "devDependencies": {\n "chalk": "^4.1.0",\n "glob": "^7.1.3",\n "prettier": "^1.18.2"\n }\n}
\nThat's pretty much it for my implementation. You can see my sitemap\nhere https://lukeboyle.com/sitemap.xml.
\n", "url": "https://boyleingpoint.com/blog/posts/sitemaps-for-next-js-static-sites", "title": "Sitemaps for Next.js static sites with dynamic routes", "date_modified": "2020-08-15T00:00:00.000Z", "author": { "name": "Luke Boyle", "url": "https://boyleingpoint.com" } }, { "id": "https://boyleingpoint.com/blog/posts/do-not-trust-google", "content_html": "\nI realise that this is one of the most well-explored topics on the\nprivacy-conscious edges of the internet, but seriously... Do not trust Google.\nFacebook seems to be our current punching bag of choice because of their supposed\nability to manipulate political opinion, but in my opinion Google is a much more insidious company\nwith far greater potential for abuse. Google is the largest advertising platform by\na significant margin (accounting for 36.3% of advertising in the U.S. with Facebook trailing at 19.3%). At the\nend of the day, if you delete your Facebook account, what are you really missing out on?
\nGoogle (or more specifically Alphabet Inc.) owns the largest search engine (Google.com), the largest video streaming platform (Youtube), and the most-used smartphone operating system (Android). You might ask, "What's wrong with that? Sounds like they're just very successful at what they do". Well, let's break down those three markets (Search, Streaming, Mobile).
\n\nGoogle's estimated market share for search traffic globally is 92.16% source.\nAs people increasingly use search to navigate the web (as opposed to typing a URL into the address bar), this\ntraffic increases, those people see more ads, and Google makes more money. Google then uses this money to purchase exclusivity\nagreements with the likes of Apple (just two years ago it was announced that Google would pay $12 billion US to Apple to remain\nthe default search engine on Safari in 2019 source, a cost of roughly\n$10 per user).
\nIf you ask the average user how Google search works, they'd probably say "it just searches for your search term across the web", but that is just the tip of the iceberg. Other dimensions of search include:
\nThere's certainly an argument to be made for suppressing some search results, such as pro-authoritarian sites\n(e.g. communist or fascist), extremely fringe conspiracy, illegal pornography, or bomb-making instructions. Advertisers\nprobably don't want their ads next to those results. Rightly or wrongly, Google is already suppressing content from\nsuch websites (though, they're probably still indexing them).
\nIf Google can suppress fascist content from sites like Stormfront (prominent white-supremacy forum), then who is to\nsay which content they can or cannot suppress? Breitbart is a well-known right wing news site that has had their\ncontent almost entirely purged from\nGoogle search results (as evidenced by the "search engine visibility" chart below).
\n\nYou don't have to agree with them politically to see that Google is applying different standards to conservative content\nthan to more liberal content. I don't visit Breitbart, I don't read their articles, and frankly I don't give a shit\nwhat they have to say, but I believe in a free and open internet. If you believe in a free and open internet then you\nhave to agree this is wrong. During the cold war, anyone who didn't follow the extreme protectionist beliefs of the\ntime was shouted down as a communist (Even Martin Luther King Jr.\nwas dismissed as a communist by J. Edgar Hoover). This same thing is happening now, but the buzz word is different.\nThe new weaponised word is "Nazi". If time had elapsed differently, I have no doubt that it would be left-wing\nwebsites suppressed in search results, and that still wouldn't be okay.
\nThere's plenty of evidence to suggest that Google is manually making these decisions to block conservative websites,\nhowever, Alphabet CEO Sundar Pichai denied that they manually censor websites at the\nrecent Congressional antitrust hearing except\nfor in cases where there are legal requirements or copyright issues. I don't buy that, personally.
\nWhen YouTube was founded it was facing severe scaling problems (because video processing and streaming is extremely expensive).\nFortunately for them, Google saw potential in the platform and purchased the company for $1.65bn in Google stock, and their money\nissues were over. Google was throwing money into scaling the platform, and it was experiencing great growth. This success turned\nout to be a major problem for YouTube because, from the time it was purchased, it has been making a loss. In recent years,\nYouTube has become profitable; however, without the bottomless pockets of Google behind it, they never would have been\nable to accomplish this. What incentive could Google have to take losses year after year on YouTube? Well, it turns\nout user data is particularly delicious. Mastercard's CEO has infamously said "data is the new oil".\nI personally can't wait for Facebook, Amazon, and Google to become para-military organisations in the up-coming data wars.
\nYouTube has essentially bullied their way into market dominance using Google's bottomless pit of money. This is problematic\nbecause it allows failing companies to cheat death, just like a bottom-feeding fish latching onto a whale shark and hitching a ride.\nAs I mentioned before, video streaming is extremely expensive, so it makes sense that great cloud infrastructure is a prerequisite to\nsuccess. Well, big surprise, Google offers world-class commercial cloud infrastructure with Google Cloud Platform (GCP)!\nDo you suppose YouTube is paying full price for their infra?
\n\nSo, when you see a headline that says "Stop paying for iCloud – Google One will now back up your iPhone for free",\nbefore obeying the shill who wrote it, you should ask yourself, "How can a company afford to give away so much storage space for free?". Well, they can't.\nGoogle simply obscures their losses with the immense revenue from Google Ads in the profit/loss statement at the end of each quarter.\nFor more reading about this topic, Tim Bray has a fantastic article called "Break up Google".
\nThis article is already becoming too long, so I'm just going to cover mobile quickly. As Tim Bray mentions in the\narticle above, Android isn't really a business. The only real non-ad revenue they have is from the commission they get\nfrom app purchases and licensing fees from OEMs (e.g. Samsung, Huawei, LG). How, then, are they able to sustain\nhundreds of highly paid engineers and all the other non-technical staff required to support the system?
\n\nAbove is a map of Android vs iOS market shares. You can see that iOS pretty much only has the dominant market share\nin first world countries (like USA, Canada, Australia, UK, Japan). Most of the emerging countries in the world are\nstrongly in favour of Android because, unlike Apple, the OS is not restricted to a particular device. So, countries like\nIndia (where the number of smart-phone users has increased sharply from 199 million in 2015 to 401 million in 2020 source)\nmostly purchase low-cost phones (e.g. Huawei, Xiaomi, Oppo). Emerging markets are extremely important to companies\nlike Google partly because they are easier to exploit: they don't have strong legislation to protect\nusers from predatory advertising, anticompetitive tactics, or data-privacy abuses. This is why I speculate that Mastercard is\nscrambling to connect refugees to the global payment network (Remember that quote from the Mastercard CEO: "Data is the new oil")\nand, indeed, why Mastercard forced Patreon to ban Robert Spencer for his anti-refugee sentiment.
\nAgain, regardless of whether you agree with someone's political leaning or rhetoric, I shouldn't have to explain\nwhy it's ludicrous for people to believe that faceless, soulless corporations such as MasterCard or Google give\ntwo fucks about moral righteousness when their only servant is a number ticker on the Nasdaq website.
\nSo, after reading all of that, I have to ask:
\nWhy don't you route all of your web traffic through Google Servers?
\n\nTo be clear, I'm not accusing Google of storing DNS logs or associating that with specific users (they claim that they\ndon't in their terms of service), however, I think it's unreasonable to think that they wouldn't be capable of that. I\nalso wouldn't put it past them to lie in terms of service considering their recent run-ins with the law ($1.7bn fine for anti-competitive behaviour, $170m for violating children's privacy on YouTube, 50 million euro fine for GDPR violations).
\n$2bn doesn't matter to Google. It's a drop in the bucket, especially considering they would probably be able to freely\nharvest user data for months or even years before they're caught and slapped on the wrist. If a single user's search data\nis worth upwards of $10 a year (see the Safari Google default search engine deal) for Google, then the complete logs of\ntheir browsing history would be quite juicy indeed.
\nOkay, so that's verging on conspiracy theory I suppose. Maybe Google DNS will remain clean. How about you get a Google®\nNest™ WiFi mesh router and let them inspect all of your web traffic that way?
\n\nOr perhaps you want to buy the new Pixel and give them advanced analytics about how you use your phone (privacy class action lawsuit), everywhere you\ngo (Location History), how much physical activity you do (Google Fit), every article/video you engage with (Chrome),\neverything you buy (Google Pay - and incidentally, how much disposable income you have, so they can better target more\nrelevant ads to you). All of these "services" are simply a ruse so that Google can build an extremely accurate profile\nabout the type of consumer you are and target you with more advertising to turn you into a soulless consumer.
\nI don't want these people to also be the arbiters of what content I should or should not be able to see online.
\nWell that was pretty depressing. So, how can you reclaim a shred of your privacy?
\nThere's a swathe of privacy-focused alternatives popping up these days. I personally\nuse duckduckgo.com which\nis built on the Bing search API and does not track any user data. I'll concede that Duckduckgo's search results\naren't as good, but I'm okay with that. Another one is https://www.startpage.com/ which\nactually uses Google results, but ensures Google can't track your activity.
\nI'm currently using invidio.us which, like Startpage, is just a wrapper for YouTube. So you can get the\nsame content minus the tracking. Bonus: check out Invidition on the Mozilla extension store to\nopen all YouTube links in invidio.us instead.
\nI really don't have an answer for this one. I'm an iPhone user, but really, Apple is not much better, especially if\nyou care about having a repairable device. If you really want to go hardcore there's some custom Android forks like https://grapheneos.org/
\nI didn't really touch on Chrome, but I'm not happy with Chrome either. Since Edge has switched to using Chromium, the\nonly real non-Chromium competitor with meaningful browser market share is Safari. I use Firefox because I believe\nin Mozilla and their commitment to maintaining privacy. They're doing good stuff lately.
\n", "url": "https://boyleingpoint.com/blog/posts/do-not-trust-google", "title": "Do not trust Google", "date_modified": "2020-08-01T00:00:00.000Z", "author": { "name": "Luke Boyle", "url": "https://boyleingpoint.com" } }, { "id": "https://boyleingpoint.com/blog/posts/olad-results-so-far", "content_html": "3 months ago I started a new OLAD program. So far it has been a massive\nsuccess. My lifts are up considerably, and I'm just generally enjoying my time in the gym.
\nSo how did I "increase my 1RM by 18-35%"? One simple trick! "Adherence" (and maybe a bit of residual newbie gains).
\nAdherence before OLAD
\n\nAdherence after OLAD
\n\nI really just owe this adherence to the renewed enjoyment I've had in the gym. It's pretty great to get out of the gym\nwithin 45 minutes (excluding prehab/rehab). I was quite fatigued by the few cycles of 5/3/1 I had just done (not to\nmention the 1.5 hour workouts), so it made sense I was burned out.
\nAll things considered, I'm really happy with this program. After my second knee dislocation in 2019 I didn't expect to\nbe squatting again but here we are. It's not the most encouraging sign when your bench is beating your squat, but I'm not giving up.
\nI have no recent squat footage so here's some of this weird reverse safety bar front squat thing I got from John Meadows
\nNew lifters often place far too much importance on choosing the right program.\nStronglifts or Starting Strength? 5/3/1 or PHUL? In reality, the right program\nis the one that gets you in the gym consistently. For 9 months (minus a couple\nfor a patellar dislocation) I haven't been running a program; instead I get to\nthe gym in the morning and decide on my core lift. The chosen exercise depends\non a few factors like how much energy I have, how my joints feel,\nhow INTENSE I'm feeling...
\nMy progress plateau might suggest that this experiment was a failure, however, it\nhas made the gym far more enjoyable than cranking out the same repetitive workout\nweek in and week out. I also noticed that I end up spending far more time on the\ncore exercise and often won't add any accessories. Workouts are overall shorter and\nmore satisfying. I've also been able to rotate in more variations (e.g. push press,\npin press, safety bar squats, deficit deadlifts) which helps with lift boredom.\nThe next logical step from here would be to make my workouts more consistent and strategic.
\nThe One Lift a Day (OLAD) system has gained more popularity in recent years. Eric Bugenhagen\nhas been championing OLAD for years and Alec Enkiri\nrecently broke down his OLAD program. Given Enkiri's insane\nstrength and general athleticism (585lb deadlift, 4.5 second 40, 60" box jumps) it's\nalways interesting to see how his programs reflect that. I challenge you to find a cookie\ncutter program that includes resisted sprints. The exercise selection with OLAD is entirely\nup to you and should be based on your goals, but Alec suggests including a squat, hip hinge,\nloaded carry, horizontal press, vertical press, and upper-body pull.
\nFor rep schemes and progression I turned to Dan John's\none lift a day program. This\nprogram is built on 4-week cycles like so:
\nI'm going to be doing this program for 3 months (i.e. 3 of these 4 week cycles).\nMy exercises (with a recent set in brackets):
\nI'll be documenting progress for the next 3 months and we'll see if the gains gods bless me.
\n", "url": "https://boyleingpoint.com/blog/posts/experimenting-with-olad", "title": "Experimenting with OLAD (One lift a day)", "date_modified": "2020-04-10T00:00:00.000Z", "author": { "name": "Luke Boyle", "url": "https://boyleingpoint.com" } }, { "id": "https://boyleingpoint.com/blog/posts/mac-miller-circles", "content_html": "\nOne of the unfortunate realities of life is that people die. The music industry,\neager to exploit, will often take this opportunity to cash in. They'll\nbastardise the hard work of productive musicians and release\nposthumous albums, and countless remixes of classic songs from various artists.\nLook no further than the discography of Notorious B.I.G, 2pac, Big L, or more recently, XXXTENTACION.\nThe latest project from X's estate was an album released in late 2019. Out of 25 tracks, 17 tracks had\nfeatured artists for a total of 21 guest verses.
\nThe announcement of Circles was something different. Mac was working diligently\nto finish this album before his untimely death in 2018. A significant portion of\nSwimming and Circles was "executive produced" by Jon Brion who carried the torch and tried to finish\nthe album as Mac intended it. When I initially saw his writing credits on Swimming I\nwas quite surprised.
\nI pondered the mystery of how you connect the dots between an eccentric composer who is most\nwell known for his work on movie scores and a trendy rap artist. Turns out, it's not that\nmuch of a mystery at all. In an\nepisode of What's In My Bag with Amoeba records Jon Brion says, "You know,\nmy complaint about most people who make records and go out and shove their shit down people's throats is that...\nall I see them doing is giving me their impression of what they think they're supposed to be doing. And it's\nwhat bores me about 99.999% of people who make stuff." For a man who said that to clear his calendar and\nhelp finish Circles, he must have really been excited about working with Mac.
\nIf you consider his work in film, Jon really does a masterful job of\nconveying tone.\nI have to thank Jon for the inspiration/encouragement he gave Mac, but in\nhis interview with Zane Lowe, he insists\n"That's not something I created, that's something he was doing and I was only\nasking him to recognise that it was already great."
\nThe album is littered with references to time. More specifically it seems to be about how you perceive time.\nDo you let time be a tyrant in your life and fight it, or do you go with the flow and ride it out?\nOne of the greatest albums of all time is Dark Side of the Moon which also featured references to time\nvery prominently. Dark Side of the Moon starts and ends the same way, with a heartbeat. This seems to\nrepresent the cycle of life: a notion that is paralleled in the opening track Circles. The line goes\n"I just end up right at the start of the line. Drawin' circles." Similarly, every day ends\nthe same way it begins; the hands of the clock go around in a circle until they strike 12.
\n\n\n\nHey, one of these days we'll all get by
\nDon't be afraid, don't fall
\n
After the very sombre start, the synth and funky bassline (shout out to MonoNeon)\nof Complicated was fairly jarring on first listen. After going back through for a second time, it really just seemed\nto make sense. The lyrics are very dour, which contrasts\nbeautifully with the rather upbeat and playful instrumental. Many songs use this tactic like a trojan horse to insert\nsome meaning into poppy songs, because they probably wouldn't top the Billboard 200 if the instrumentals matched the\nlyrics. (See Hey Ya by Outkast). A recent example is in the Purple Mountains\nsong All My Happiness Is Gone. The late David Berman said in an interview "it just complexifies the profile of it to have the music and the words at odds". As it turns out, Complicated wasn't\noriginally made for the album[1], but it fits beautifully. Leading on from Come Back to Earth on Swimming\nwhere he says "I just need a way out of my head", Complicated has the first reference to his difficulties unravelling\nthe mess in his head which ties into Good News later in the album.
\nWith Circles and Complicated, I get the impression that Mac was living in the present which is a comforting way to\napproach life when you're going through a hard time. Lyrics like "I've got all the time in the world, so for now\nI'm just chilling" and "'fore I start to think about the future. First, can I please get through a day?".\nHe's taking life one day at a time and working through the "clutter" in his head. The downside of this approach is you\ncan easily forget about the bigger picture and end up clinging to destructive coping mechanisms.
\n\n\n\nWon't give a fuck about tomorrow if I die today
\n\n
Aside from a few leaks like "Telescope" (which became Woods) and Once a Day (which was played during his A\nCelebration of Life as a straight piano ballad), Good News was the only music released to promote the album\n(perfect choice). Mac opens the track talking about fighting his demons\n("I spent the whole day in my head, Do a little spring cleanin'"), but often feeling hamstrung by his own instincts to\nself-sabotage. He says, "I wish that I could just get out my goddamn way", and "Why I gotta build something beautiful\njust to go and set it on fire?"
\nWith a different interpretation, these lyrics could have been delivered with a very moody inflection to create a much\ndarker tone, but the muted string plucking and sparse instrumentation give this very calming, ethereal feeling.\nThis track also features guitar from Wendy Melvoin (guitarist for Prince's band, The Revolution). Considering\nthat John Mayer played guitar on Small Worlds, it's really clear how infectious Mac Miller's talent was.\nThese great musicians happily collaborated with him with very little public recognition. Paired with the subdued\nvocals it is just a beautiful tribute to his life and legacy and a reassuring reminder from the great beyond.
\nEspecially when coupled with the music video, it has a surreal quality that feels\nlike he's in the room talking to you. It almost sounds like he's reassuring people from the great beyond with lyrics\nlike "There's a whole lot more for me waitin' on the other side". It's a story of the immortal quality of\nmusic. So, it only makes sense that an image of Mac appears in a Lotus flower which has a symbolic meaning in Buddhism.\nIt represents the purification of a spirit born into murkiness[2]. The ending of him walking through\nthe airplane window and disappearing as a ripple in the water is superb and continues to drive home the idea of swimming.\nGood News is the best send-off anyone could hope for.
\n\n\n\nI was drowning but now I'm swimming
\n
Woods is my favourite track on the album. It's probably the most subdued track with very sparse lyrics and\ninstrumentation. The opening lines of "Things like this ain't built to last, I might just fade like those\nbefore me" could be interpreted a few ways. This album was recorded shortly after the loss of a\nrelationship. It feels like saying "If you were able to forget the people you used to love and love\nme, then surely you'll be able to forget me and move on". Another interpretation could be his anxiety\nabout his legacy and whether he did enough to be remembered.
\nHand Me Downs is a testament to Mac's commitment on his last two albums. In my opinion the best albums seldom have\n(many) artists featured. To me, it exudes a lack of confidence in what you're producing, as though you\nfeel you need to attach bigger names to your songs for them to be well liked. The feature on this album\nby Baro Sura was a very purposeful choice and it only seems to serve the album's narrative and vibe. And with Baro\nbeing a relatively unknown artist it feels like this was someone Mac really liked and believed in, and he delivered\na great performance.
\nI've written before about how I like albums the most when there's a consistent vision from the lyrics to the production.\nOut of all of his studio albums, Mac seemed to take care of around 30-50% of the production and outsource the rest of\nit to other producers[3]. Circles, however, sees a staggering 75% of production handled by\nMac. For whatever reason, the production is credited to Mac Miller, not his oft-used production moniker "Larry Fisherman".\nI like to believe that's because this send-off is a time capsule for us to see who Malcolm really was around the time\nof his death. A truly honest expression of Mac the musician.
\nHands takes the metaphor of time further. To me, the beat sounds reminiscent of a clock ticking, and hands\nseems to refer to the hands of a clock. It seems to be a letter to himself urging him to ease up\non himself and stop feeling so low. "When's the last time you took a little time for yourself?"\nand later in the chorus "No, I stay behind the wheel and never half-speed". "Never half-speed"\nmight suggest that he is always speeding and might benefit from slowing down. This hearkens back to\nthe line on Small Worlds (Swimming): "I'm always in a rush, I been thinking too much".
\nThroughout the whole album it feels as though there's two versions of Mac. There's the self-reprimanding Mac with\nunrelenting standards (the Yin, if you will) and the Mac that is more forgiving and reinforcing that his imperfections\naren't the sum of his existence (the Yang). If he could just achieve a balance between these two sides then maybe\nlife would be just a little bit less complicated. This concept of Mac's duality could also explain why the album art\nfeatures two images of himself superimposed.
\n\n\nYeah, why don't you wake up from your bad dreams?
\nWhen's the last time you took a little time for yourself?
\n
Strangely, it feels as though any track on the album could have been the last track and it wouldn't\nbe lacking. But the final track represents the last goodbye, and the crucial thing to nail is the tone.\nJon Brion's decision to put Once A Day at the end was very intentional.\nWhat message did he want to leave us with for the final act of Mac?\nOnce a day opens with a monologue "Once a day, I rise. Once a day, I fall asleep with you". He's\nreally talking to himself. You spend more time with yourself than anyone else and it's important\nfor you to be comfortable with that. "Don't keep it all in your head. The only place that you know nobody\ncan ever see". Once A Day is a tale of inner peace and a final reminder that we need to stay\nopen, not just to others, but most importantly, to ourselves.
\nThe painful part of listening to this album is the feeling of finality.\nThere will never again be a Mac Miller album that was wholly - or even\npartially - designed by him.\nThe circle is often seen as a symbol of permanence and immortality, like\nthe Ouroboros (a snake eating its own tail) in Greek mythology. So, it's only\nfitting that the album to solidify his legacy in our minds is Circles.
\nCircles feels like a realisation that life isn't a comedy but it's more\nof a tragedy with comic relief. True maturity is welcoming that reality\nbut recognising that sometimes the best you can do is keep Swimming in Circles.
\n\nArguably, the key feature that made Gitlab a market-leading platform was\ntheir decision to build the platform as an end-to-end application\ndelivery service including version control, CI, infrastructure,\ncommunity engagement, and so on. The simplicity that comes with this\ncentralisation made Gitlab really stand out when compared to the\nAtlassian suite of Bitbucket, Jira, and Bamboo. Even more so when compared\nto Github at the time, since their market offering pretty much started\nand ended at git (with some other things like gh-pages, marketplace, etc).
\nIt has been a couple of years since Gitlab's rise to prominence and the\nmarket has certainly shifted. Even before Github was acquired by Microsoft\nin mid-2018 (source),\nthey were hard at work pushing out feature after feature.
\nOff the top of my head, I can recall these:
\nGithub actions is now in open beta\n(you can opt in here: https://github.com/features/actions)\nand it enables you to set up containerised builds, testing, deployments\nin response to many github events (push, pull requests, tags, schedule).
\nThe process is much the same as something like CircleCI, Travis, or Buildkite.\nThe integration for CI checks on pull requests and commits has been in\nGithub for years, allowing early warning for pull requests that break\nthe build.
\nIn this post I'll be showing you how to set up Github Actions to build and release\na single-page React app.
\nKeep in mind that the v1 Github Actions syntax has been deprecated, so make sure you are looking\nat the yaml documentation. There's a handy warning at the top of the deprecated pages:
\n\n\nThe documentation at https://developer.github.com/actions and support for the HCL syntax in GitHub Actions will be deprecated on September 30, 2019. Documentation for the new limited public beta using the YAML syntax is available on https://help.github.com.
\n
Find the docs here: https://help.github.com/en/categories/automating-your-workflow-with-github-actions
\nFor this example, I'll be using Create React App. Initialise that if\nyou'd like to follow along, or just retrofit an old, simple project.
\nThere are two flows I want to create: a CI build that runs on every push and pull request, and a deployment that only runs on pushes to master.
\nLet's create the action file.
\nCreate a folder in the root of your repo called .github/workflows
\nCreate a file in that folder called ci.yml
Let's look at the ci.yml file and add some boilerplate
\nci.yml
name: CI\n\non: [pull_request, push]\n\njobs:\n build:\n runs-on: ubuntu-18.04\n\n steps:\n - uses: actions/checkout@master\n - name: Use Node.js 10.x\n uses: actions/setup-node@v1\n with:\n version: 10.x\n - name: Build\n run: |\n npm install\n npm run build --if-present
\nThe first thing to note is on line 3, there is an option called on
(docs for on
. This field is a list of events you want to respond\nto. For this one, I'm responding to both pushes and pull requests. Because this on\nproperty is at the top level, you can't give different steps\ndifferent triggers directly; you either split the work across\nseparate action files or add per-step conditions. In principle, actions should be entirely self-contained processes.
The jobs field is a list of independent actions. By default, they run in parallel. You could use this to separate things\nlike your unit and integration tests to speed up your CI. This example is pretty simple, so I haven't found a use\nfor multiple jobs yet.
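For illustration, a split into separate unit and integration test jobs might look something like the following sketch (the job names and npm scripts here are hypothetical, not part of this project):

```yaml
jobs:
  # These two jobs have no dependency on each other,
  # so Github runs them in parallel by default.
  unit-tests:
    runs-on: ubuntu-18.04
    steps:
      - uses: actions/checkout@master
      - run: |
          npm install
          npm run test:unit
  integration-tests:
    runs-on: ubuntu-18.04
    steps:
      - uses: actions/checkout@master
      - run: |
          npm install
          npm run test:integration
```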
\nThe steps field is quite simple in this example. For each step, you can choose to specify the uses
field (docs).\nThe format for this argument is [owner]/[repo]@[ref]
or [owner]/[repo]/[path]@[ref].
You can reference actions in\nyour current repository or you can reference standard actions as per the example above.\nactions/checkout@master
checks out the current branch. actions/setup-node@v1
sets up the requested version of Node. You can provide arguments to the action using the with
key.
Now, the magic begins. Go to your repository and visit: https://github.com/[yourName]/[yourRepo]/actions
. You'll be prompted\nto enable Actions for this repository. Hit enable and then commit your ci.yml
file, push it up and check the Actions tab.\nYou should begin to see your commits start popping up under the relevant action.
In the image below, you can see the left side has the name of the action, the event that triggers it, and the jobs below that.
\n\nWith luck, we now have our CI build successfully running.\nOnto the deployment action. Copy the below to your ci.yml
\nci.yml
name: CI\n\non:\n pull_request:\n push:\n branches:\n - master\n\njobs:\n build:\n runs-on: ubuntu-18.04\n\n steps:\n - uses: actions/checkout@master\n - name: Use Node.js 10.x\n uses: actions/setup-node@v1\n with:\n version: 10.x\n - name: Build\n run: |\n npm install\n npm run build --if-present\n - name: Deploy\n if: github.event_name == 'push' && github.ref == 'refs/heads/master'\n env:\n AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}\n AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET }}\n run: node scripts/deploy.js
\nYou'll note that at the moment we're executing this on both push and pull_request events.\nThis means that unless we add a filter, we'd be deploying from\nany pull request, which would likely break our app.
\nTo the Deploy
step, we've added an if. This if should have a boolean\nvalue that will determine whether to run the step or not.
You could do things like check if a step was successful, or, as in our case, check that the event is a push to the master branch.
\nMoving onto deployment, if you look at the env key, this is how we\nprovide environment variables to the step. These are accessible in\nnode scripts via process.env\n. The AWS values in this example aren't\nhardcoded strings; they come from the secrets manager Github provides within your\nrepository. Don't worry about that node script yet.
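As a sketch of that mechanism (this is not the linked deploy script, and the helper name is made up), a node script might read and validate those variables like so:

```javascript
// Sketch: read the CI-provided credentials off process.env and fail
// fast with a clear message if a secret wasn't configured.
const REQUIRED_VARS = ['AWS_ACCESS_KEY_ID', 'AWS_SECRET_ACCESS_KEY'];

function readCredentials(env) {
    const missing = REQUIRED_VARS.filter((name) => !env[name]);

    if (missing.length > 0) {
        throw new Error(`Missing environment variables: ${missing.join(', ')}`);
    }

    return {
        accessKeyId: env.AWS_ACCESS_KEY_ID,
        secretAccessKey: env.AWS_SECRET_ACCESS_KEY,
    };
}

// In the action, this would be called as readCredentials(process.env).
```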
At a previous job, they outlawed all external CI services because they were worried about their AWS IAM keys getting\nout in the event of a CircleCI data breach. Given that we're dealing with Github + MSoft, I have to believe there's\nsome encryption magic happening when you upload and access these secrets. Once you've set the value in the secrets, you will not\nbe able to see it again and it will only be exposed to the CI agent.
\nI tried to log one of these secrets, and cleverly, it was censored in the logs (see below). Gone are the days of\nhaving to rotate your IAM keys because you accidentally logged it in your CI or Cloudwatch.
\n\nI'll come back to those AWS secrets shortly.\nFrom this point, all we have to do is deploy.\nI'm going to offer three suggestions:
\nI would argue that S3 is superior to Github Pages. The unfortunate part\nof Pages is that it can only serve from files in the repository, so you\nhave to commit your built files in order to host. However, Pages are\nfree forever, unlike S3 sites which will begin to cost if you start\nhaving significant traffic. If performance is a concern for you, look\nelsewhere as neither of these are going to be blazing fast.
\nThat said, if simplicity is your priority, I'd suggest going with Github Pages, as you'll avoid\nsetting up an additional account (and potentially save $$).
\nMost sites I make are not under high demand, nor do they have many\nconcurrent users, so for my purposes, S3 storage is more than enough.
\nI also use Cloudflare to cache the assets, so the majority of sessions\ndownload assets off the Cloudflare CDN, rather than S3, so my usage\nstays very low for S3. This also has the benefit of using Cloudflare's\nsmart routing to make my Sydney hosted S3 bucket much faster for\ninternational users.
\nSee the example repository here: https://github.com/3stacks/github-actions-react-s3
\nFirst I'll quickly go through how to get your S3 bucket and IAM keys and be a bit responsible in the process.
\nHit Create Bucket\nand give it a url-friendly name, the same as the domain you will use for it.\nUncheck the Block all public access\ncheckbox, then add the following bucket policy:\n{\n "Version": "2012-10-17",\n "Statement": [\n {\n "Sid": "PublicReadGetObject",\n "Effect": "Allow",\n "Principal": "*",\n "Action": "s3:GetObject",\n "Resource": "arn:aws:s3:::your-arn-here/*"\n }\n ]\n}
\nWith this policy, any user that queries can get any object in the bucket, so please, don't store anything private in there.
\nEnable static website hosting on the bucket and set the index document to index.html
We're going to start by making a policy that is our deployment policy for this bucket. It ensures that if the keys to\nan IAM user leak all you'll be giving away is access to that single bucket.
\n{\n "Version": "2012-10-17",\n "Statement": [\n {\n "Sid": "VisualEditor0",\n "Effect": "Allow",\n "Action": "s3:ListBucket",\n "Resource": "arn:aws:s3:::your-arn-here.io"\n },\n {\n "Sid": "VisualEditor1",\n "Effect": "Allow",\n "Action": ["s3:PutObject", "s3:GetObject", "s3:DeleteObject"],\n "Resource": "arn:aws:s3:::your-arn-here.io/*"\n }\n ]\n}
\nCreate a new IAM user with Programmatic Access\nenabled. On the permissions screen, choose Attach existing policies directly\nand attach the deployment policy from above. Then, in your repository secrets, create a secret named AWS_ACCESS_KEY_ID\nand copy the corresponding value from your newly created IAM user. Do the same for the secret access key, naming it AWS_SECRET
Now your Github Action will pick these up in ci.yml
. Copy the contents\nof the deployment script from here: https://github.com/3stacks/github-actions-react-s3/blob/master/scripts/deploy.js\nto a directory (./scripts/
is what was defined in ci.yml
, but you\ncan change this if you prefer a different directory). Make sure you update\nthe S3 bucket name on line 24.
Your ci.yml
workflow should resemble the below:
ci.yml
name: CI\n\non:\n pull_request:\n push:\n branches:\n - master\n\njobs:\n build:\n runs-on: ubuntu-18.04\n\n steps:\n - uses: actions/checkout@master\n - name: Use Node.js 10.x\n uses: actions/setup-node@v1\n with:\n version: 10.x\n - name: Build\n run: |\n npm install\n npm run build --if-present\n - name: Deploy\n if: github.event_name == 'push' && github.ref == 'refs/heads/master'\n env:\n AWS_DEFAULT_REGION: ap-southeast-2\n AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}\n AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET }}\n run: node ./scripts/deploy.js
\nEnsure you set the region you would prefer in the deploy env.
\nNow we're done! Commit those changes, push it and you'll see the build\nrun and deploy your app.
\nVisit http://[bucketName].s3-website-ap-southeast-2.amazonaws.com/
to verify.
From now on, commit on master and your code will be deployed automatically.
\nSee the example repository here: https://github.com/3stacks/github-actions-react-pages
\nVisit https://github.com/[yourName]/[yourRepo]/settings
and scroll to the Github Pages section.\nHere you may enable Github Pages from the master\nbranch (serving either the root folder, i.e. you build into the root directory, or the /docs folder) or from a gh-pages\nbranch. I prefer to use a separate branch as it's generally\nadvisable to keep your master branch clean of build files.
To enable the gh-pages
branch, the repo must already\nhave one. In your terminal, do the following:
git checkout -B gh-pages\ngit push origin gh-pages
\nBack in your browser, select the gh-pages
branch in the Pages\ndropdown (See below):
From here, deployment is fairly painless. Let's take advantage of the\nActions ecosystem Github is building and use: https://github.com/marketplace/actions/deploy-to-github-pages?version=1.1.2,\nan action written by James Ives.
\nFirst we have to generate a personal access token.
\nClick Generate new token
Select the appropriate scopes. We only need repo related scopes (below)
\nDo not share this key with anyone. It has read/write access to all your repositories.
\nAdd the secret as per the Storing and using the secrets section\nabove, calling your access token secret GITHUB_ACCESS_TOKEN
Back in ci.yml
, update the workflow to add the Pages deploy step:
name: CI\n\non:\n pull_request:\n push:\n branches:\n - master\n\njobs:\n build:\n runs-on: ubuntu-18.04\n\n steps:\n - uses: actions/checkout@master\n - name: Use Node.js 10.x\n uses: actions/setup-node@v1\n with:\n version: 10.x\n - name: Build\n run: |\n npm install\n npm run build --if-present\n - name: Deploy to GitHub Pages\n uses: JamesIves/github-pages-deploy-action@1.1.3\n if: github.event_name == 'push' && github.ref == 'refs/heads/master'\n env:\n ACCESS_TOKEN: ${{ secrets.GITHUB_ACCESS_TOKEN }}\n BRANCH: gh-pages\n FOLDER: build
\nOur secret and other required arguments will be provided to the Pages\nDeploy action using the env
key.
Due to the way the routing is done in github pages, assets referencing\n/
will go to the root of your Pages (e.g. https://3stacks.github.io
).\nThis means none of the assets in CRA will be loaded. To get around this,\nin your package.json
, add "homepage": ".",
. This will make it resolve\ncorrectly.
Now we're done! Commit those changes, push it and you'll see the build\nrun and deploy your app.
\nVisit http://[yourName].github.io/[repo-name]
to verify.
From now on, commit on master and your code will be deployed automatically.
\nCOMING SOON - This section is not complete
\nGithub Actions also supports using specific Docker containers\nfrom Dockerhub. So if you have complicated dependencies, you can\nchoose to utilise this option. Use the uses
key and give it a path\nin the format of: docker://[image]:[tag]
(docs: https://help.github.com/en/articles/configuring-a-workflow#referencing-a-container-on-docker-hub)
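For instance (the image, tag, and command here are purely illustrative), a step pinned to a specific container might look like:

```yaml
steps:
  # Pull alpine from Dockerhub and run a command inside it.
  - name: Say hello from a container
    uses: docker://alpine:3.10
    with:
      args: echo hello
```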
\n", "url": "https://boyleingpoint.com/blog/posts/github-actions-for-web-apps", "title": "Github Actions for web apps", "date_modified": "2019-08-12T00:00:00.000Z", "author": { "name": "Luke Boyle", "url": "https://boyleingpoint.com" } }, { "id": "https://boyleingpoint.com/blog/posts/software-i-actually-believe-in", "content_html": "I do a lot of complaining about privacy and annoying products, but there\nare some that I believe really do a good job. These are some companies\nwhose products and missions are appealing enough I'd want to work there.
\nYNAB is a very interesting take on budgeting. I used to swear by my old\nway of using a spreadsheet, but it sort of falls apart when your pay\nis irregular (like for self-employed people or freelancers with variable\nincome). You can connect your bank accounts for automatic transaction\nfeeds but I prefer doing it manually as it seems to make you more mindful\nabout your spending.
\nI'm a real metric head, so I appreciate some good graphs.
\nAge of money tells you how long your money sits around between the day\nyou earn it and the day you spend it. It's very encouraging to see yourself breaking the cycle of\npay-cheque to pay-cheque.
\n\nNet worth is pretty self explanatory. It tracks your assets versus your\ndebts and gives you a nice net worth graph over time.
\n\nYou also get a categorical breakdown of your spending which you can\nclick into to see more specific information about each category
\n\nThe bulk of their blog posts are not YNAB specific, but include general\nadvice for budgeting, so if you're struggling, it may be helpful for you.
\nIf that sounds appealing, there's a link below which includes a referral\n(if you sign up I get a free month. If you aren't cool with that,\njust search for YNAB). They offer a month long free trial if you feel\nlike giving it a shot.
\n\nAccording to their website, Cloudflare now powers nearly 10 percent of\nall Internet requests. I've been using them for a few years now and\nI'm still in awe of them. First of all, when I started using them I was\nstill paying for SSL certificates, then here comes this start-up that\noffers DDoS protection, SSL and caching and it's free... Where's the\ncatch? I do find it somewhat suspicious that they're able to offer these\nservices for free. Presumably the money they make off enterprise\naccounts offsets the usage at the free tier.
\nThe DNS settings are really easy to use too. I use the analytics on this\nsite and it seems to block a few threats a week.
\nThey are now offering domain registrations which I haven't taken advantage\nof, but they seem to be cheaper than your run of the mill registrar.
\nThey also do a lot of very interesting technical writing. Curious why\ntheir office has a wall covered in lava lamps?
\nTechnical version - https://blog.cloudflare.com/lavarand-in-production-the-nitty-gritty-technical-details/\nNon-technical version - https://blog.cloudflare.com/randomness-101-lavarand-in-production/\nArticle about it - https://www.fastcompany.com/90137157/the-hardest-working-office-design-in-america-encrypts-your-data-with-lava-lamps
\nCheck them out: https://www.cloudflare.com/
\nPassword managers are certainly rising in popularity and it's a good thing.\nThe password is a very flawed authentication method, especially when you\nre-use the same weak password across multiple sites. If you can instead\nremember one very strong password, you'll be able to generate strong,\nunique passwords for every service you use. They also now have built-in\nsupport for the Google Authenticator protocol with TOTP tokens.
\nI really like the mobile and desktop apps and they have recently\nreleased a browser only client. Along with their provided cloud sync\noptions, they also offer personal cloud storage syncing.
\nYou can also enable travel mode for when you're overseas which stops\nsyncing sensitive vaults.
\nI use the shared vaults a lot to share with coworkers.
\nI'm planning on keeping this list updated should anything change, so\nkeep your eyes on this post.
\n", "url": "https://boyleingpoint.com/blog/posts/software-i-actually-believe-in", "title": "Software I actually believe in", "date_modified": "2018-12-21T00:00:00.000Z", "author": { "name": "Luke Boyle", "url": "https://boyleingpoint.com" } }, { "id": "https://boyleingpoint.com/blog/posts/case-study-for-iterative-development", "content_html": "Agile development is something that has evolved to become a bit of a joke in the software industry,\nmuch like an obscure gag amongst friends that evolves over time to the point where the humour is\nincomprehensible to anyone on the outside. Today, we may find ourselves being handed little laminated\ncut-outs with clipart of t-shirts on it and being implored to stick it on the wall, playing estimate\npoker, or writing love letters to team members in a retrospective meeting. In my experience, it seems\nto be common understanding amongst programmers that the ceremonies associated with Agile err on the\nside of bizarre, but businesses love it. In my estimation, it’s the idea that they are fostering a\ncollaborative environment. Whether or not it’s just an illusion is another story, but in the age of\nBlockchain, chatbots, and machine learning, Agile is king.
\n"Agile" in its current sense appears to be derived from the Agile manifesto; however, agile practices have roots\nreaching back through the last four decades of programming history. Recently I read The\nMythical Man-Month (Brooks, 1975), in which Brooks extolls the virtues of things like\ndisposable prototypes, testing as you build, and always having a working program.
\nOne of the most recognisable and user-friendly explanations of this concept is "The Agile Bicycle"\nillustrated by Henrik Kniberg
\n\nThis is a great example of delivering a minimum viable product (MVP). There are many benefits to this\nmethod:
\nRegardless of how rough around the edges your product is, if it is functional, then people can use\nit. It may not have the appeal to gain significant traction, but you can start getting at least some\nROI, and - perhaps more importantly – user feedback. If a product is fundamentally flawed, it\nshould be visible at any stage. According to Brooks, an incremental build method is better because:
\nThe most important part is that while we may not deliver the full feature set at the\ninitial release date, at the very least, we’re not going to be giving people a car without a steering wheel.
\nSo, how does a company selling pre-packaged meals relate to software MVPs?
\nI’ve been using them for around a year. I picked them because unlike similar competitors, they offered\nmeals with higher calorie counts for a similar price point. My first delivery came in an unmarked\nStyrofoam box. Styrofoam is good at insulating contents; however, it requires specialised machinery\nto recycle and takes untold millions of years to degrade, so it’s not a great material. The meals came\nin take-away style containers with a sticker slapped on, which were easily broken in transit, and they\nwere all frozen. On the technical side, subscriptions were not manageable by the user and had to go\nthrough customer service, which added some friction. It wasn’t a mind-blowing experience, but the\nmeals all tasted good and most importantly, the business model worked.
\n\nOver the last 12 months I’ve observed various improvements to their offering.
\nWhile people starting to use them now will see the last year of enhancements as the norm,\npeople who have been using it for a longer period will have gradually had improved\nservice, thus increasing satisfaction. Rather than overreaching and increasing the\nrisk of being crushed by their overhead, My Muscle Chef took an iterative approach\nand gradually built a loyal base of customers which enables further innovation.
\nIn my eyes, iterative development is inarguably superior to traditional waterfall\nproject management where oftentimes budget, schedule and feature set are inflexible.\nAs the saying goes, "you don’t know what you don’t know", and as such, progressive\ndiscovery will often prove many of your initial assumptions incorrect. It’s very\nrefreshing to see companies with more tangible products embracing Agile principles\nand prospering. As they say, the proof is in the pudding.
\nTo be clear, I am in no way affiliated with this company, I just like eating their\nfood. If you do end up signing up, consider using my referral code (S1HKD51IM)\nand we’ll both get $15 credit. Love those free meals.
\n", "url": "https://boyleingpoint.com/blog/posts/case-study-for-iterative-development", "title": "My Muscle Chef: A case study for iterative development", "date_modified": "2018-11-28T00:00:00.000Z", "author": { "name": "Luke Boyle", "url": "https://boyleingpoint.com" } }, { "id": "https://boyleingpoint.com/blog/posts/mac-miller", "content_html": "\n\n\nI was drowning but now I'm swimming
\n
In an interview with Zane Lowe, Malcolm said "Do you ever feel\ninvincible? I lived a certain life for 10 years and faced almost no\nreal consequences. I had no version of the story that didn't end up\nwith me being fine". He had recently been arrested for crashing his\ncar while under the influence of alcohol. He took this as a wake-up\ncall, but it appears it was too little too late. Perhaps if he had faced\nconsequences sooner, he wouldn't have been allowed to fall so far\ninto the hole.
\nFame is a double-edged sword and for Malcolm being 20 years old with a\nBillboard topping debut album, he was thrown into the spotlight and\nthings really didn't stop for him since then. Imagining myself at\n20 becoming wealthy and famous, I doubt there's any chance I would\nexercise any level of restraint. At that time, being known as a\nproducer of "frat rap", it almost seems like a self-fulfilling prophecy\nthat that would lead to out of control partying and substance abuse.\nThe entertainment industry has a habit of dragging people in and\nbeating the shit out of them. When you consider the story of Avicii\nand the adversity he faced essentially being forced to tour and perform\neven when he was begging his manager to cancel the shows, it's easy to\nsee why so many people don't make it. Other artists such as Deadmau5\nand Earl Sweatshirt were able to see the warning signs and they\ntook breaks to take care of themselves. This year Earl Sweatshirt cancelled\na tour to Australia following the death of his father. With any luck he\nwill emerge having dealt with his grief healthily and be better for it.
\nNot everyone is so lucky, though. Look no further than the\n27 club; a sprawling list of\npeople who likely garnered significant fame in their late teens or early\n20's but didn't manage to see the decade out. It is a systemic issue\nthat doesn't have a clear root cause. Is it the idolisation of relatively\nyoung adults? Or is it a result of an abusive industry that chews people\nup and spits them out?
\nIt was as early as 2012 that Malcolm spoke about how he didn't want to\ndie of an overdose, but how sobriety was just boring. In the few years\nafter his career took off, it was clear to his fans that he was not\ndoing well. 2015 marked the release of GO:OD AM, and he took the\nopportunity to check in and let everyone know he was doing alright.\nThe dark period was seemingly over. Contrary to what most people will\nhave you believe, there is a voluntary element to depression. There's a\nvery fine line between living with depression and living in hell. If\nyou allow your self-loathing tendencies to consume you, you will be in\nhell. Alcohol is a well known depressant, but it's hardly the only thing\npeople commonly abuse. I don't imagine anyone comes out of the other\nside of an opiate high and thinks "wow, that was an awesome time, I\nfeel great about doing that". Giving in to substances is just one way\nwe sabotage ourselves.
\nLife is full of peaks and troughs and it's a true tragedy that Malcolm\n(and so many other people for that matter) didn't make it to their next\npeak. For me, this is a lesson about the fragility and the tragedy of\nlife. The pain that we feel from his passing will eventually be eclipsed\nby the gifts of his music and positive energy. The world has changed as a\nresult of what Malcolm accomplished during his short life. I think it's\nwell worth waking up in the morning and being a part of it.
\nI don't think I'll ever understand why his death affected me so much.\nIt's quite bizarre how connected we as fans were able to feel with him,\nnever having met him, but I think that's just a testament to the\nartist and person he became. From a frat rapper to a soulful musician\nexpressing himself honestly and uninhibited, he touched fans and artists\nalike. We are truly fortunate to have had the opportunity to listen to\nhis magnum opus Swimming.
\nI encourage you to watch the Mac Miller: A celebration of life concert and to donate to the Mac Miller circles fund
\nRest in peace Mac Miller.
\n", "url": "https://boyleingpoint.com/blog/posts/mac-miller", "title": "Mac Miller", "date_modified": "2018-11-07T00:00:00.000Z", "author": { "name": "Luke Boyle", "url": "https://boyleingpoint.com" } }, { "id": "https://boyleingpoint.com/blog/posts/converting-wordpress-site-to-static", "content_html": "The last iteration of this website was a truly insane infinite scrolling\ncarousel that was very overwhelming to anyone who dare behold it, so with this\nversion (which recently had its first birthday) I decided to go with a much\nmore content focused design since I actually wanted to start writing more\npublicly. There's also something to be said about not confusing people or\nforcing them into epileptic fits.
\nAt the time, I didn't want to sink a lot of time into it, so WordPress was\nidentified as the path of least resistance. I used Bedrock by Roots to version\ncontrol my plugins and WordPress with Composer. It was working well and was\nquite fast (for a WordPress website), but it still suffered from a fairly\nfundamental issue of not being able to version control content. WP apologists\nmight tell you to store your database dumps in your repo, but to them I say;\n"yeah, nah". If you ever have the misfortune of looking at a WP database dump,\nyou'll realise there’s about a billion lines of muck which is totally\nirrelevant to the content and composition of your website and I don’t\nparticularly like the idea of storing my users table in a public git repository\nanyway. In spite of my whinging, the version controlled content pain point was\nmore of an under-the-tongue ulcer type of pain than a broken arm so I didn't\nworry about it. One day I made the mistake of upgrading the WP version on my\nserver and I hadn't copied the install to my local, so there was a lot of out\nof sync content. So you can imagine I was pretty happy when I found out my\nlogin no longer worked, I couldn’t reset my password and changing the password\ndirectly in the database didn't work. I took an sql dump of the database and\nloaded it into my local only to find the Advanced Custom Fields don’t appear\nto be stored in the database, so when I salvaged the content it was totally\nbroken.
\nThen it hit me. What if I get a JSON dump of my posts from the database and\nturn that into a static version? So, what output format would be most suitable\nfor an archive of text posts?
\nMarkdown was invented by notable 'f-word' writer John Gruber in 2004 and it\nhas since become a staple in the development world. I chose to use Markdown\nas the output because it provides simple shorthands to represent markup so I\nknew I could get tidy archiving in Github that would be nicely rendered as\nhtml in the web view, but the posts would still be readable (and writable for\nfuture posts) when looking at the source. I created a node package for\ngenerating an archive and published it to npm in the\nhopes that it might address the problem for other people too.
\nNow I have my posts nicely sorted and stored in a repo,\nbut the problem with generating an archive of Markdown files is then you just\nhave an archive of Markdown files to deal with.
\nThe website is built with the static site generator "Gatsby", so all pages\nare React components which really adds a lot of flexibility. For example, when\ngenerating blog post components I can make the title render as a link to the\nblog post slug but only when it appears on the front page.
\nThe ingestion strategy is to add the blog-posts repository as a submodule so\nI can then update and push those independently. Then, at compile time, I would\nread the archive of blog posts and generate:
\nThe script that is responsible for this is really something to behold (you can\nsee that here).\nThe process is such that all markdown files are grabbed from the archive, then\nfor each post, the script will parse out a metadata table in the top of the\nfile that has the post title and whether or not it is a draft.\nThat post is then passed to the markdown renderer and we generate a blog post\ncomponent with that rendered content. That blog post component is then given\nits own page component and it’s stitched onto the aggregate blog post list.\nThe blog post list is then parsed out into pages which are output as components\nand voilà. I suppose if there’s a gap for it, I could publish a "WordPress\nMarkdown archive to React static site" package, but it may be a bit too niche.
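As a rough illustration of the metadata-parsing step (a simplified sketch, not the actual script linked above), suppose each post opens with a ---‑delimited key/value block:

```javascript
// Sketch: pull a simple key/value metadata block from the top of a
// markdown post, then return the title, draft flag, and body.
// Assumes metadata lines look like "title: My Post" between "---" fences.
function parsePost(source) {
    const lines = source.split('\n');
    const meta = {};
    let bodyStart = 0;

    if (lines[0] === '---') {
        let i = 1;
        while (i < lines.length && lines[i] !== '---') {
            // Split on the first colon only, so values may contain colons.
            const [key, ...rest] = lines[i].split(':');
            meta[key.trim()] = rest.join(':').trim();
            i += 1;
        }
        bodyStart = i + 1;
    }

    return {
        title: meta.title || '',
        isDraft: meta.draft === 'true',
        body: lines.slice(bodyStart).join('\n').trim(),
    };
}
```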
\nThe end result is an overall slimmer repository since all of the blog posts\nare stored in a different repository and the generated pages are not committed\nwhich lends itself perfectly to an automated deployment service. It also\nallowed for much less human intervention in the creative process.
\nThe main caveat I’ve discovered in this transition is that I didn't have a\nsolution for porting assets (such as embedded images) to the markdown archive.\nCurrently, any embedded images will 404 until they are added manually.\nThis definitely isn't ideal and if I ever get a chance I plan to package all the\nlinked assets down into each blog post.
\n", "url": "https://boyleingpoint.com/blog/posts/converting-wordpress-site-to-static", "title": "Converting a WordPress site to a React static site", "date_modified": "2018-01-08T00:00:00.000Z", "author": { "name": "Luke Boyle", "url": "https://boyleingpoint.com" } }, { "id": "https://boyleingpoint.com/blog/posts/project-estimations-made-easy", "content_html": "I recently published a post on the Stak Digital\nengineering blog about our new app Guesstimate and project estimation in general. To read the post, head on over to the\npost here: https://stak.digital/blog/project-estimations-made-easy
\n", "url": "https://boyleingpoint.com/blog/posts/project-estimations-made-easy", "title": "Project estimations made easy", "date_modified": "2017-12-19T00:00:00.000Z", "author": { "name": "Luke Boyle", "url": "https://boyleingpoint.com" } }, { "id": "https://boyleingpoint.com/blog/posts/responsive-definition-lists-solved-by-flexbox", "content_html": "Consider the definition list. Here's a simple example. The standard behaviour would have the term and definition both as block level elements, naturally stacking down like so.
\nBut what if we want the term and definition to sit inline? This usage is semantically a dl, but traditionally this has been a serious pain in the ass if you want consistent spacing between the terms/definitions. The image below exhibits a compromise I made with the designer on a previous project. Making the dt/dd inline-block works to a certain degree; however, when setting widths explicitly you will have serious issues going down the breakpoints. The display:block
span just forces the content to stay in its respective line. This, however, is not correct usage, as a dl
is only supposed to have dt
or dd
elements inside it. EDIT: Since working on this project, it looks like we're now permitted to wrap a dt+dd
group in a div to control flow. So how can flexbox help us here?
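The flexbox technique the title refers to boils down to something like this (a sketch with made-up widths; the project's actual styles differed):

```css
dl {
  display: flex;
  flex-wrap: wrap; /* lets each dt/dd pair flow onto rows */
}

dt {
  flex: 0 0 30%; /* term takes a fixed share of the row */
}

dd {
  flex: 0 0 70%; /* definition fills the remainder, inline with its term */
  margin: 0;
}
```

The dt and dd stay direct children of the dl (so the markup remains valid), and the spacing stays consistent at every breakpoint because the widths are shares of the row rather than explicit pixel values.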
In Agander, I made my first forays into colour themes. In a very simple approach, I have two colour schemes (light and dark) which are represented on the body as a class (scheme-light and scheme-dark respectively). The general approach for styling a component is as follows: _button.scss
// Define base component styles (e.g. sizing/positioning)\n.button {\n border: 1px solid;\n padding: 6px 5px;\n}\n\n// Dark Color scheme styles\n.scheme-dark {\n .button {\n background: white;\n border-color: white;\n color: black;\n }\n}\n\n// Light Color scheme styles\n.scheme-light {\n .button {\n background: black;\n border-color: black;\n color: white;\n }\n}
\nAlthough this is quite lightweight, there are still issues: every themable property has to be duplicated for each scheme, and the selector nesting grows with every theme you add.
\nEnter the CSS variable (the hero we need). CSS variables are defined like so:
\n:root {\n  /* Initialise the variable */\n  --primary-color: pink;\n}\n\np {\n  color: var(--primary-color); /* it's pink, baby. */\n}
\nThe var
function also takes a second argument, which is a fallback value used when the variable is not set.
p {\n color: var(--primary-color, red);\n}
\nCSS variables are scoped to the elements they are declared on: variables defined in :root\nare considered to be global (though they may be overridden inside specific components), while variables defined on any other element apply only to that element and its descendants. This is broken down very nicely in a recent Smashing Magazine article.
I recently wrote a library to ingest variable names and values and spit them onto the root element (see the package). The idea is that each theme would have all relevant variables defined in objects like so:
\nconst viewState = {\n  currentTheme: 'darkScheme'\n}\n\nconst themes = {\n  darkScheme: {\n    'primary-color': {\n      hex: '#FFF'\n    }\n  },\n  lightScheme: {\n    'primary-color': {\n      hex: '#000'\n    }\n  }\n}
\nAnd then when the currentTheme changes:
\nimport syncVars from '@lukeboyle/sync-vars';\n\nfunction updateCssVariablesWithCurrentScheme(colorScheme) {\n  syncVars(themes[colorScheme]);\n}\n\n// if we call that function with 'darkScheme',\n// the root element ends up as:\nupdateCssVariablesWithCurrentScheme('darkScheme');\n\n// <html style="--primary-color: #FFF;"></html>
\nSo, how does this help? For one thing, with this approach, I no longer have to worry about adding the colour scheme classes to the body, and I don't have to do any hacky overrides, etc. _button.scss
now looks like this:
.button {\n border: 1px solid var(--text-color-var);\n padding: 6px 5px;\n background: var(--button-background-color-var);\n color: var(--text-color-var);\n}
\nLooking forward, this approach also means that custom colour themes are very nearly in reach. It also means that colour schemes could be changed on the fly. The user could have a colour swatch tool and be previewing their theme changes live. Taking it even further, it means that the colour schemes no longer need to be a part of the codebase. It could just as easily be a JSON file on the server and changes could be flexibly pushed. Why is this exciting? Say it's Christmas time and you want to get into the spirit of things... With a few string replacements you have a temporary festive theme to force upon your users.
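As a sketch of how little is involved, a fetched theme object can be mapped straight to variable declarations (`festiveTheme` and `themeToDeclarations` are made up for illustration; in practice syncVars handles this):

```javascript
// Sketch: turn a fetched theme object into CSS custom property
// declarations, ready to be set on the root element. The theme shape
// mirrors the objects above; 'festiveTheme' is a made-up example.
function themeToDeclarations(theme) {
  return Object.keys(theme)
    .map(name => `--${name}: ${theme[name].hex};`)
    .join(' ');
}

const festiveTheme = {
  'primary-color': { hex: '#C0392B' },
  'secondary-color': { hex: '#27AE60' }
};

themeToDeclarations(festiveTheme);
// → "--primary-color: #C0392B; --secondary-color: #27AE60;"
```

A theme pushed from the server as JSON only needs to pass through a function like this before landing on the root element's style attribute.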
\nSites or apps could have buttons to activate color blind mode and specific 'problem' colours could be swapped out for friendly colours. Additionally, high contrast modes would be a breeze.
\nUsers could activate alternate modes for websites to get a different experience.
\nCSS variables are getting me really excited because they're the first minimal-overhead approach to theming in front-end-only applications. This is something that will reward well-structured stylesheets and result in a better experience for the user. I am looking forward to rolling out custom themes in Agander and finally getting around to making the flat UI theme I have wanted to make for some time.
\n", "url": "https://boyleingpoint.com/blog/posts/css-variables-a-case-study", "title": "CSS Variables: A Case Study", "date_modified": "2017-03-26T00:00:00.000Z", "author": { "name": "Luke Boyle", "url": "https://boyleingpoint.com" } }, { "id": "https://boyleingpoint.com/blog/posts/css-buttons-solved-with-flexbox", "content_html": "There are two commonly accepted approaches to making buttons with CSS, but both of them are a little bit shit. What if I told you there was another way? (morpheus.wav
)
<style>\n  .button-padding-approach {\n    -webkit-appearance: none;\n    border-radius: 0;\n    border-style: solid;\n    border-width: 0;\n    cursor: pointer;\n    font-weight: normal;\n    line-height: normal;\n    margin: 0;\n    position: relative;\n    text-align: center;\n    text-decoration: none;\n    display: inline-block;\n    padding: 1rem 2rem 1.0625rem 2rem;\n    font-size: 16px;\n    background-color: #999;\n    color: #000;\n    max-width: 170px;\n  }\n</style>\n\n<div><a class=&quot;button-padding-approach&quot; href=&quot;#&quot;>A Button</a> <a class=&quot;button-padding-approach&quot; href=&quot;#&quot;>A Button that breaks to two lines</a></div>
\nThis approach works okay, and it's good for multi-line text (buttons where the marketing team sanctioned too much copy). The problem with typography is that glyphs can have descenders (as in y and j) which push the bottom of the bounds down. So if you want to properly vertically center your text, you have to baby the padding so much that it becomes too much of a pain in the ass. The padding on the above buttons is padding: 1rem 2rem 1.0625rem 2rem;\n. Five significant figures for the bottom padding? I don't think so.
<style>\n  .button-lineheight-approach {\n    -webkit-appearance: none;\n    border-radius: 0;\n    border-style: solid;\n    border-width: 0;\n    cursor: pointer;\n    font-weight: normal;\n    margin: 0;\n    position: relative;\n    text-align: center;\n    text-decoration: none;\n    display: inline-block;\n    font-size: 16px;\n    background-color: #999;\n    color: #000;\n    max-width: 170px;\n    height: 50px;\n    line-height: 50px;\n    padding: 0 2rem 0;\n  }\n</style>\n\n<div><a class=&quot;button-lineheight-approach&quot; href=&quot;#&quot;>A Button</a> <a class=&quot;button-lineheight-approach&quot; href=&quot;#&quot;>A Button that breaks to two lines</a></div>
\nThis approach is a lot less hands on for the vertical alignment. You set height: 50px;
and line-height: 50px;
and voilà, perfect vertical alignment. Until you need two lines, and then the text bleeds out of the button because you thought a CTA would never be more than 3 words long. At this point you're forced to either increase the button width or reduce your font-size, and neither is very designer-friendly.
<style>\n  .button-flexbox-approach {\n    display: flex;\n    justify-content: center;\n    align-items: center;\n    -webkit-appearance: none;\n    border-radius: 0;\n    border-style: solid;\n    border-width: 0;\n    cursor: pointer;\n    font-weight: normal;\n    line-height: normal;\n    margin: 0;\n    position: relative;\n    text-align: center;\n    text-decoration: none;\n    padding: 1rem 2rem 1.0625rem 2rem;\n    font-size: 16px;\n    background-color: #34495e;\n    color: #fff;\n  }\n  .button-flexbox-approach:hover {\n    color: #fff;\n  }\n  .flex-button-container {\n    display: inline-block;\n  }\n</style>\n\n<div>\n  <div class=&quot;flex-button-container&quot;><a class=&quot;button-flexbox-approach&quot; href=&quot;#&quot;>A Button</a></div>\n\n  <div class=&quot;flex-button-container&quot; style=&quot;max-width: 170px;&quot;>\n    <a class=&quot;button-flexbox-approach&quot; href=&quot;#&quot;>A Button that breaks to two lines</a>\n  </div>\n</div>
\nThe main caveat of this approach is that the button now needs a container. The container doesn't need anything fancy on it, just display: inline-block;
to allow the content to naturally scale, and if you want to restrict how large the button can be, add max-width: x;
Other than that, this approach is pretty bullet-proof from my testing and I like it a lot.
I was recently given the job of rebuilding a particularly bad landing page from an external company. Apart from class names, styles and markup being all over the place, there was a particularly obnoxious form validation script sitting in the middle of the page. An excerpt of the script can be seen below, and this documents the process I took when reviving the JS side of things.
\n<script type="text/javascript">\n\n var flagValidation;\n\n /* validation for 'phone number' */\n function PhoneNumberValidation() {\n var phoneNum = document.getElementsByName("Phone")[0].value;\n var normalPhonepattern = /^[0-9\\s\\-\\+]{6,14}$/g;\n\n if(!normalPhonepattern.test(phoneNum))\n {\n flagValidation = false;\n document.getElementById("PhoneValidation").innerHTML = "Only numbers, '-' and '+' characters are accepted"\n }\n else\n document.getElementById("PhoneValidation").innerHTML = ""\n }\n\n function SubmitDetails(){\n flagValidation = true;\n PhoneNumberValidation();\n\n return flagValidation;\n }\n\n</script>
\nSo what is wrong with this picture?\n\n- There's no reason for this to be a script tag on the page; let's make it an external script\n- Mutation: setting the flag to false should not be the responsibility of these functions\n- The flagValidation variable being globally scoped and mutated/used in several places leaves a lot of places for it to fail when making changes\n- The functions are doing too much; looked at from a functional standpoint, they should just return a bool, and a final validate function can follow up\n- Repeating code (e.g. document.getElement...\n) unnecessarily\n\nWhen you allow your functions to be purely functional, this function...
function PhoneNumberValidation() {\n var phoneNum = document.getElementsByName("Phone")[0].value;\n var normalPhonepattern = /^[0-9\\s\\-\\+]{6,14}$/g;\n\n if(!normalPhonepattern.test(phoneNum))\n {\n flagValidation = false;\n document.getElementById("PhoneValidation").innerHTML = "Only numbers, '-' and '+' characters are accepted"\n }\n else\n document.getElementById("PhoneValidation").innerHTML = ""\n }
\nCan become...
\nfunction isPhoneNumberValid() {\n const phoneNumber = document.getElementsByName("Phone")[0].value;\n const phoneNumberRegex = /^[0-9\\s\\-\\+]{6,14}$/g;\n return phoneNumberRegex.test(phoneNumber);\n}
\nMuch prettier, right? Once we've refactored all of those individual functions, the main input validation function looks like this:
\nfunction validateFormInputs(event) {\n  let isFormValid = true;\n  const phoneNumberFeedback = document.getElementById("PhoneValidation");\n\n  if (isPhoneNumberValid()) {\n    phoneNumberFeedback.innerHTML = '';\n  } else {\n    phoneNumberFeedback.innerHTML = "Only numbers, '-' and '+' characters are accepted";\n    isFormValid = false;\n  }\n\n  if (isFormValid) {\n    contactForm.removeEventListener('submit', validateFormInputs);\n    return true;\n  } else {\n    event.preventDefault();\n  }\n}
\nIt's cleaner, sure, but I'm still not okay with using and mutating that isFormValid
variable and innerHTML
appearing every other line. Let's take it further. Let's outsource the error message work to a utility function.
function generateErrorMessage(element, message) {\n return element.innerHTML = message;\n}\n\n// So we use that like this...\n\nif (isPhoneNumberValid()) {\n generateErrorMessage(phoneNumberFeedback, '');\n} else {\n generateErrorMessage(phoneNumberFeedback, 'Cannot be empty');\n isFormValid = false;\n}
\nThe next step is to stop mutating that validity flag. To do this, I'm going to bundle all the validation methods into an object and then reduce that to return an isFormValid bool.
\nconst fields = {\n  phoneNumber: {\n    isFieldValid: function() {\n      const phoneNumber = document.getElementsByName("Phone")[0].value;\n      const phoneNumberRegex = /^[0-9\\s\\-\\+]{6,14}$/g;\n      return phoneNumberRegex.test(phoneNumber);\n    },\n    userFeedbackElement: document.getElementById("PhoneValidation"),\n    errorMessage: "Only numbers, '-' and '+' characters are accepted"\n  }\n};\n\n// Generate an array from the keys of the fields object and reduce\nObject.keys(fields).reduce((acc, curr) => {\n  // do stuff\n}, true);
\nIf you're not familiar with Array.reduce
, it will iterate over each item in the array and allow you to process them. The arguments are acc
(the accumulator) and curr
(the current item). The idea is that we're going to execute each function and then show/hide error messages accordingly. The function now looks like this:
function validateFormInputs(event) {\n\n const isFormValid = Object.keys(fields).reduce((acc, curr) => {\n const currentField = fields[curr];\n\n if (currentField.isFieldValid()) {\n generateErrorMessage(currentField.userFeedbackElement, '');\n return acc;\n } else {\n generateErrorMessage(currentField.userFeedbackElement, currentField.errorMessage);\n return false;\n }\n }, true);\n\n if (isFormValid) {\n contactForm.removeEventListener('submit', validateFormInputs);\n return true;\n } else {\n event.preventDefault();\n }\n\n}
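Stripped of the DOM work, the reduce-to-a-boolean pattern looks like this (the validators here are made up, purely to show the mechanics):

```javascript
// The accumulator starts as true and can only ever be forced to false,
// so the result is true only if every validator passes. These
// validators are made-up examples, not the ones from the real form.
const validators = {
  phone: () => /^[0-9\s\-\+]{6,14}$/.test('0412 345 678'),
  name: () => 'Luke'.length > 0
};

const isEveryFieldValid = Object.keys(validators).reduce(
  (acc, key) => (validators[key]() ? acc : false),
  true
);
// isEveryFieldValid is true; a single failing validator would make it false
```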
\nThis implementation is clearly case-by-case. It works for my particular scenario because there's only one validation condition for each field. If there were more rules, the approach would need to change to compensate, and it might not be able to stay as dynamic. It should also be noted that this is a fairly over-engineered solution. I wouldn't say that the original approach is wrong, but mine looks at the same problem from a functional programming standpoint, and I believe it is much cleaner and much more robust. For a view of the entire file, see my gist at https://gist.github.com/3stacks/c5c49904684e4ddec48aa017ab912db9
\n", "url": "https://boyleingpoint.com/blog/posts/functional-form-validation-in-java-script-aka-inheriting-bad-java-script", "title": "Functional Form Validation in JavaScript (aka: Inheriting bad JavaScript)", "date_modified": "2017-01-30T00:00:00.000Z", "author": { "name": "Luke Boyle", "url": "https://boyleingpoint.com" } }, { "id": "https://boyleingpoint.com/blog/posts/automating-css-regression-testing-with-argus-eyes-phantom-js", "content_html": "I have had my eyes on Argus Eyes (http://arguseyes.io/) for quite some time and now I have the time to implement it at work. The interface is rather simple. You define your browser breakpoints, the pages, and the parts of the pages you wish to capture. All components
are defined with a name and a selector, for example &quot;.site-nav&quot; or &quot;body&quot;. You define all components in the components array, but you can then cherry-pick which ones are used on each page; the homepage may use the hero component, for instance, while the about page may not.
{\n "sizes": ["320x480", "1280x768", "1920x1080"],\n "pages": [\n {\n "name": "homepage",\n "url": "http://localhost:3000/",\n "components": ["hero", "all"]\n }\n ],\n "components": [\n {\n "name": "all",\n "selector": "body"\n },\n {\n "name": "hero",\n "selector": ".hero"\n }\n ]\n}
\nSince I'm generally against installing npm packages globally (and you probably should be too), I define my capture scripts in package.json
. This presents the first issue: The usage of Argus is like so: argus-eyes capture <branch-name>
But this of course only names the capture for you. It's your responsibility to switch branches. So the workflow becomes:
1. Check out the develop branch\n2. argus-eyes capture develop (this is the baseline)\n3. Check out your feature-branch-name\n4. argus-eyes capture feature-branch-name\n5. argus-eyes compare develop feature-branch-name
Argus then uses blink-diff to compare the two sets of screenshots you just captured (note: you shouldn't change your config between captures) and outputs any screenshots in which there are visual differences. For example, bumping the padding on your nav will result in something like this. It's not a super-intelligent representation; however, it does quickly show you that something is wrong. In my opinion, the current workflow makes it almost not worth bothering. So how do we make it a one-step test?
\nI am attempting to simulate this entire process in node. For this, we'll need a few things: nodegit to drive the repository, and something like shelljs to shell out to the argus-eyes commands.
\nI've tried to make the node script as pure as possible. I created a file called argus-test.js
. In that, there is an individual function for each git action. First is a function to open the repo.
/**\n * @param {string} path - path to the repository (.git)\n * @returns {Promise}\n */\nfunction openRepository(path) {\n return Git.Repository.open(path);\n}\n\n// Path is based on current working directory\nconst repoPath = require("path").resolve("./.git");\n\nopenRepository(repoPath).then(...)
\nopenRepository returns a Promise which resolves with a reference to the repository. To act on the repository, we need to keep track of this returned value. Since all of the nodegit functions return Promises, we're going to be seeing a lot of then
.
// Initialise this let to keep track of which branch we're on\nlet featureBranch;\n\n/**\n * @param {Repository} repo - The reference to the repository object\n * @returns {Promise}\n */\nfunction saveCurrentBranch(repo) {\n return repo.getCurrentBranch();\n}\n\nopenRepository(repoPath).then(\n repo => {\n saveCurrentBranch(repo).then(repoName => {\n featureBranch = repoName;\n });\n },\n err => {\n // Usually would only happen if you give it the incorrect path\n throw new Error(error);\n }\n);
\nNow we have a reference to the current feature branch stored for later. In the function where we set the featureBranch variable, we execute our capture functions.
\nshell.exec(\n `node node_modules/argus-eyes/bin/argus-eyes.js capture ${featureBranch}`\n);\n\n// Successful output will say something like "12 screenshots saved to .argus-eyes/feature-branch-name"
\nThis is the tricky part, and the biggest hurdle: we have to switch to whatever the base branch is (develop in this case). Although the function is simple, it may fail if there are any uncommitted changes, so it's probably best to warn the user to make sure everything is committed or stashed first.
\n/**\n * @param {Repository} repo - The reference to the repository object\n * @returns {Promise}\n */\nfunction switchToDevelop(repo) {\n return repo.checkoutBranch('develop');\n}\n\nswitchToDevelop(repo).then(...)
\nAfter successfully changing to develop, we still have to capture the branch and then compare them, which is done like so:
\nshell.exec('node node_modules/argus-eyes/bin/argus-eyes.js capture develop');\n\nshell.exec(\n 'node node_modules/argus-eyes/bin/argus-eyes.js compare develop ' +\n featureBranch\n);
\nIf Argus detects any screenshots over the threshold for change, it will save the diff in a folder like .argus-eyes/diff_develop_feature_branch_name
For the full file in action, check out this gist: https://gist.github.com/3stacks/0976ef8a84c50c6096aea09dbbbebd88
To improve this process, it might be an idea to save the baseline captures in the repo and then overwrite them whenever you push to that branch. This would eliminate the need to switch branches.
\n", "url": "https://boyleingpoint.com/blog/posts/automating-css-regression-testing-with-argus-eyes-phantom-js", "title": "Automating CSS regression testing with Argus Eyes (PhantomJS)", "date_modified": "2016-12-14T00:00:00.000Z", "author": { "name": "Luke Boyle", "url": "https://boyleingpoint.com" } }, { "id": "https://boyleingpoint.com/blog/posts/local-storage-manager-version-2-1-is-out-now", "content_html": "The latest version of local-storage-manager has had the internal interface greatly improved for tidiness and best practice, and now has the new Namespace feature. Traditionally, you would have to store your data like so:
\nconst appState = {\n key1: {...},\n key2: {...}\n}
\nand set the data like this:
\nlocalStorageManager.set('appData', appState);
\nThe issue with this is you may not want key1
and key2
to be grouped together in one object, but you also don't want them tossed straight into local storage as top-level keys. With namespaces you can do this:
localStorageManager.set('key1', key1, 'myAppState');\nlocalStorageManager.set('key2', key2, 'myAppState');
\nThis makes it easier to access all of your data at once while still keeping those keys theoretically separate. When accessing the namespaced data, you simply add the namespace as the second arg like so:
\nlocalStorageManager.get('key1', 'myAppState');
\nThe library is now more robust internally and can handle cases of missing data better. It also uses the getItem
and setItem
methods internally instead of accessing the localStorage directly. To get started, install via npm with npm install @lukeboyle/local-storage-manager
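Under the hood, namespacing amounts to reading the namespace object, updating one key, and writing it back. Roughly (a sketch of the idea on top of getItem/setItem, not the package's actual internals):

```javascript
// Sketch of how namespaced set/get can work on top of getItem/setItem:
// the namespace is a single storage entry holding a JSON object.
// This illustrates the idea only -- it is not the package's real code.
function setNamespaced(storage, key, value, namespace) {
  const existing = JSON.parse(storage.getItem(namespace) || '{}');
  existing[key] = value;
  storage.setItem(namespace, JSON.stringify(existing));
}

function getNamespaced(storage, key, namespace) {
  const existing = JSON.parse(storage.getItem(namespace) || '{}');
  return existing[key];
}
```

In the browser you would pass window.localStorage as the storage argument.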
See the npm page with documentation and in-depth instructions at https://www.npmjs.com/package/@lukeboyle/local-storage-manager
A quick-start guide for running Karma tests for Chrome in Travis CI. When you run Travis on a Node.js project, Travis will, by default, run npm install
and then npm test
. I first ran into the issue in an Angular project that had tests triggered in the prepublish
command. My CI build failed and I decided to remove the prepublish hook and change the name of my test script until I had the time to come back. For months I've been avoiding the issue, but I have finally solved it. The Karma docs suggest that you can run the tests in Firefox with the --browsers flag (see https://karma-runner.github.io/0.8/plus/Travis-CI.html). Travis has since updated so that Chrome can be loaded into the environment. For this to work, you'll need to make changes to your travis.yml
file and your karma config file.
travis.yml
Note that I'm using only the latest node, as that is the requirement for me.
\nlanguage: node_js\nnode_js:\n  - &quot;node&quot;\nbefore_script:\n  - export CHROME_BIN=chromium-browser\n  - export DISPLAY=:99.0\n  - sh -e /etc/init.d/xvfb start
\nThe before_script is the special part, which points Travis in the right direction for running Chrome. The last two lines are addressed in the Karma docs linked above. Personally, I am using a separate Karma config file, and I want to make the changes within that config file to keep my test script clean. My test script is:
\n"test": "karma start karma.config.js"
karma.config.js
const configuration = {\n files: [{ pattern: 'tests/**/**/**.*', watched: true }],\n customLaunchers: {\n chromeTravisCi: {\n base: 'Chrome',\n flags: ['--no-sandbox']\n }\n },\n frameworks: ['mocha'],\n browsers: ['Chrome'],\n failOnEmptyTestSuite: true,\n singleRun: true\n};\n\nif (process.env.TRAVIS) {\n configuration.browsers = ['chromeTravisCi'];\n}\n\nmodule.exports = function(config) {\n config.set(configuration);\n};
\nLuckily, Travis sets the TRAVIS environment variable, and if we detect it, we set the configuration's browsers to ['chromeTravisCi'], which is defined in customLaunchers. Have whatever pre-processors you need in the configuration object and it should work fine when you deploy.
\n", "url": "https://boyleingpoint.com/blog/posts/running-karma-tests-for-chrome-in-travis-ci", "title": "Running Karma tests for Chrome in Travis CI", "date_modified": "2016-10-13T00:00:00.000Z", "author": { "name": "Luke Boyle", "url": "https://boyleingpoint.com" } }, { "id": "https://boyleingpoint.com/blog/posts/jsx-in-vue-js", "content_html": "I've recently been experimenting with using jsx in Vue, the Vue jsx plugin for babel and using that instead of the standard template pattern. Since there are really not any official docs for the plugin, I'm going to run through a quick usage guide.
\nFor my project I'm using Webpack and plain npm scripts. Whatever your choice of build process, the important part is what you have configured in your babel config or .babelrc.
\nplugins: [\n 'transform-runtime',\n 'transform-vue-jsx'\n],\npresets: ['es2015']
\nThat's the basic requirement for getting started. To install those, run:
\nnpm install -D babel-plugin-transform-runtime
npm install -D babel-plugin-transform-vue-jsx babel-helper-vue-jsx-merge-props babel-plugin-syntax-jsx
npm install -D babel-preset-es2015
The official repo for the Vue jsx plugin is located here: https://github.com/vuejs/babel-plugin-transform-vue-jsx The interesting part about Vue jsx, in my opinion, is that it follows the Angular pattern for registering components. Whereas in React you just import a function that returns jsx and can name it whatever you like, in Vue jsx you must declare the name and register the component globally. Vue has a component method that takes a name and an object with all relevant data. The difference is that instead of a template
entry, there's a render
function which returns jsx.
Vue.component('jsx-example', {\n render (h) { // <-- h must be in scope\n return <div id="foo">bar</div>\n }\n})\n\n// Usage\n\n<div>\n <jsx-example/>\n</div>
\nh
is shorthand for the Vue instance's $createElement method, so you have to make sure that h is in scope in your components, like so:
const pageView = new Vue({\n el: '#root',\n data: {},\n methods: {},\n render () {\n const h = this.$createElement;\n return (\n <div>\n <jsx-example/>\n </div>\n )\n }\n});
\nFrom the get-go, it seems to me like we've lost some of the versatility that jsx provides by having to integrate it into the normal Vue component pattern.
\n return (\n <div\n // event listeners are prefixed with on- or nativeOn-\n on-click={this.clickHandler}\n nativeOn-click={this.nativeClickHandler}\n key="key"\n ref="ref">\n </div>\n )
\nThere's a strange thing where on-change on a form input seems to be naturally debounced, and the nativeOn-change
doesn't seem to be any different. Refs don't behave the same as in React classes either: rather than referring to an element with this.refs
, you need to use this.$refs
which follows the usual Vue convention. Since there's no documentation surrounding the jsx, I'm assuming the rest of the behaviour follows the standard Vue component pattern, but instead of a template, there's a render
function. The jsx doesn't support the normal Vue directives, so you'll have to do those things programmatically.
This article is probably no longer relevant
\nAfter much frustration with this issue, I found this section in the react material-ui documentation - React-Tap-Event-Plugin. The custom components like the select field don't work well with the traditional onClick listener, so as a temporary fix, the react-tap-event-plugin must be included in your react project. The dependency is supposedly a temporary fix. See the repo here: https://github.com/zilverline/react-tap-event-plugin
\n", "url": "https://boyleingpoint.com/blog/posts/react-material-ui-touch-events-not-firing", "title": "React Material-UI touch events not firing", "date_modified": "2016-09-24T00:00:00.000Z", "author": { "name": "Luke Boyle", "url": "https://boyleingpoint.com" } }, { "id": "https://boyleingpoint.com/blog/posts/essential-hip-hop-albums-of-the-decade-so-far", "content_html": "Earl Sweatshirt - Earl
\nBig Boi - Sir Lucious Left Foot
\nNas & Damian Marley - Distant Relatives
\nThe Roots - How I Got Over
\nKanye West - My Beautiful Dark Twisted Fantasy
\nKanye West & Jay Z - Watch The Throne
\nDanny Brown - XXX
\nKendrick Lamar - Section.80
\nDrake - Take Care
\nA$AP Rocky - Live.Love.A$AP
\nShabazz Palaces - Black Up
\nTyler, The Creator - Goblin
\nKiller Mike - Pl3dge
\nDeath Grips - Exmilitary
\nLIL UGLY MANE - MISTA THUG ISOLATION
\nScHoolboy Q - Habits and Contradictions
\nGOOD Music - Cruel Summer
\nKiller Mike - R.A.P. Music
\nKendrick Lamar - Good Kid m.A.A.d City
\nFlatbush Zombies - Better off Dead
\nEarl Sweatshirt - Doris
\nChildish Gambino - because the internet
\nYoung Fathers - Tape Two
\nChance The Rapper - Acid Rap
\nA$AP Ferg - Trap Lord
\nTyler, The Creator - WOLF
\nKanye West - Yeezus
\nPusha T - My Name Is My Name
\nBusdriver - Perfect Hair
\nclipping. - CLPPNG
\nScHoolboy Q - Oxymoron
\nOpen Mike Eagle - Dark Comedy
\nVince Staples - Hell Can Wait EP
\nRun The Jewels - Run The Jewels 2
\nFreddie Gibbs & Madlib - Pinata
\nTyler, The Creator - Cherry Bomb
\nLupe Fiasco - Tetsuo and Youth
\nWale - The Album About Nothing
\nBADBADNOTGOOD and Ghostface Killah - Sour Soul
\nDrake - If You're Reading This It's Too Late
\nJoey Bada$$ - B4.DA.$$
\nEarl Sweatshirt - I Don't Like Shit I Don't Go Outside
\nDeath Grips - The Powers That B
\nA$AP Rocky - At Long Last A$AP
\nDr. Dre - Compton
\nJay Rock - 90059
\nVince Staples - Summertime '06
\nKendrick Lamar - To Pimp A Butterfly
\n", "url": "https://boyleingpoint.com/blog/posts/essential-hip-hop-albums-of-the-decade-so-far", "title": "Essential Hip-Hop Albums of the Decade so far", "date_modified": "2016-09-18T00:00:00.000Z", "author": { "name": "Luke Boyle", "url": "https://boyleingpoint.com" } }, { "id": "https://boyleingpoint.com/blog/posts/publishing-react-components-to-npm", "content_html": "Having built and published a few React components to npm, in keeping with the plug-n-play spirit of npm, I have what I\nbelieve to be a very simple implementation for both the development and installation of components. I published a\nboilerplate project to Git/npm and this is now my go-to whenever I need to put together an external component.\nhttps://www.npmjs.com/package/@lukeboyle/react-component-boilerplate\nThe basic concept is that you have an index.jsx in a 'src' folder. This should be transpiled to ES5 and output to the\nroot directory called 'index.js'. In this instance, index.js is the "main" in your package.json. You may notice the\nentry "jsnext:main" in the package which points to the jsx file. This convention was established by\nrollup (https://github.com/rollup/rollup/wiki/jsnext:main)\nas an entry point for ES6 modules. The idea is that when you bundle using Rollup (and the ES6 import/export syntax),\nyour ES6 module will be used instead of the ES5 one. Given that we're still largely in the ES5 age, the rollup config\ngenerates an ES5 version (which is the main entry point) and an ES6 version in the src so you can feel free to write\nall the JSX goodness you please. The folder structure should roughly look like this:
\nproject-root
|--src\n| |--index.jsx\n|--index.js\n|--rollup.config.js (OR)\n|--webpack.config.js\n|--demo\n| |--dist\n| |--build files\n| |--src\n| |--src files
\nindex.jsx
import * as React from 'react';\n\nexport default function ReactComponent(props) {\n return <div>Job's Done</div>;\n}
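The rollup side of that setup is roughly this shape (a sketch only; the plugin choice and option names are assumptions based on the Rollup API of the era, so check the boilerplate for the real config):

```javascript
// Sketch of a Rollup config matching the structure above. The plugin
// and option names are assumptions, not the boilerplate's actual file.
import babel from 'rollup-plugin-babel';

export default {
  entry: 'src/index.jsx', // ES6/JSX source, exposed as "jsnext:main"
  dest: 'index.js',       // transpiled ES5 bundle, the package "main"
  format: 'cjs',
  plugins: [babel()]
};
```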
\nAlso, to play your part in improving our package ecosystem, consider\nnamespacing your package for npm: http://blog.npmjs.org/post/116936804365/solving-npms-hard-problem-naming-packages
\n", "url": "https://boyleingpoint.com/blog/posts/publishing-react-components-to-npm", "title": "Publishing React components to npm", "date_modified": "2016-08-11T00:00:00.000Z", "author": { "name": "Luke Boyle", "url": "https://boyleingpoint.com" } }, { "id": "https://boyleingpoint.com/blog/posts/dynamic-product-filtering-in-shopify", "content_html": "\n\nDisclaimer: Shopify is not good. I recommend steering clear and opting for one of many alternatives. It's an extremely closed platform that doesn't encourage innovation and naturally leans towards bad practice. Given this, if you still have to use it, read on.
\n
In Shopify, there is a native (albeit 'unsupported') filtering system based on the tags you specify on your products. If you go to a collection, you can link the user to a tag with a URL like collections/collection-name/tag-one/tag-two, and Shopify filters the products for you with some simple Javascript. Now, given that in a collection you have access to collection.all_vendors and all_types, WHY OH WHY is there no native filtering based on those? Filtering could EASILY be dynamic if Shopify cared enough to implement it.

The 'official' solution (as per the documentation: https://help.shopify.com/themes/customization/collections/filtering-a-collection-with-multiple-tag-drop-down) is to make several drop-downs and hard-code the list of tags you want to allow filtering by (e.g. tags = "red", "blue", "green"). So next week when I add a yellow shirt I have to go back into the pits and add another tag? Not happening.

This is how I make filters dynamic. After searching for hours, I can conclusively say that there is no open source solution for this, and given the constraints of the garbage Liquid templating engine, I can confidently say this is the least convoluted solution available. All it takes is a rigid structure in your tagging system, so it's much easier on a new store. The tag structure is simply category:tagName. Say you want to filter your products by brand: in the tags section of your product page, enter brand:brandName. The same goes for size:1 or color:blue. It's up to you how many categories you use, because I guarantee your collection template is going to be a BIG file. The best part about all this is that there's no array filter or equivalent method in Liquid, so we're going to have to do some crazy shit.
{% for tag in collection.all_tags %} <-- Start iterating over all tags\n {% if tag contains 'style' %} <-- Check if it contains your keyword\n {% capture raw_style_tags %} <-- Build up the variable raw_style_tags\n {{ raw_style_tags | append: tag | append: ', ' }} <-- Append each matching tag, separated by commas\n {% endcapture %}\n {% assign style_tags = raw_style_tags | split: ', ' %} <-- Split the string on the commas to build a new array\n {% endif %}\n{% endfor %}
\nThe variable style_tags
is now an array of every tag containing 'style:'. Next, make a select field whose options are all of your style tags. Note that current_tags returns a list of the tags you are currently filtering by.
<label for="coll-filter">Shop by style</label>\n<select class="coll-filter" id="coll-filter">\n <option value="">All</option>\n {% for t in style_tags %}\n {% assign tag = t | strip %}\n {% if current_tags contains tag %} <-- check if the tag is currently active - applies the selected attribute\n <option value="{{ tag | handleize }}" selected="selected">{{ tag | remove: 'style:' }}</option>\n {% elsif product_tags contains tag %} <-- else, just make it an option\n <option value="{{ tag | handleize }}">{{ tag | remove: 'style:' }}</option> <-- use the remove filter to show just the tag name\n {% endif %}\n {% endfor %}\n</select>
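On the Javascript side, the snippet from Shopify's docs essentially watches this select and navigates to a tag URL. Here's a rough sketch of that logic; buildFilterUrl is my own illustrative name, not Shopify's, and I'm assuming Shopify's convention of joining multiple active tags with +:

```javascript
// Hypothetical sketch of what the Shopify-docs filter script does:
// collect the active tags and build /collections/<handle>/<tag>+<tag>.
function buildFilterUrl(collectionHandle, activeTags) {
  // Drop the empty "All" selections
  var tags = activeTags.filter(function (t) { return t !== ''; });
  var path = '/collections/' + collectionHandle;
  if (tags.length > 0) {
    path += '/' + tags.join('+');
  }
  return path;
}

// Wiring it up would look something like:
// document.querySelector('.coll-filter').addEventListener('change', function () {
//   window.location.href = buildFilterUrl('shirts', [this.value]);
// });
```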
\nIf you include the Javascript from the Shopify docs, it will automatically\nlisten for changes to the .coll-filter select. This way, if you ever add more\ntags under the style:
category, you won't have to update your view. And the\nbest part is, you can add a new category on your product page, copy-paste\nthose lines of code, and change 'style' to whatever your new category is called.\nI must reiterate: you should only use Shopify if you have no other choice. Cheers!
It's been about two and a half months since the first official full release of Agander went live, and it's out with the old, in with the new.
\nOutwardly, the changes are minimal. The most obvious one is that the add-module dialogue is now a modal instead of a floating column element. Various styles have been optimised and reduced so that button sizes in particular are more consistent across browsers.
\nAround three quarters of the way through version 1 it became apparent that the app was outgrowing the constraints of the Vue system I had created, so the app has been rebuilt in React.js and Redux.

The standard module model

Under this model, every module has a content object and an event object. The content object holds calendar events, Asana workspaces, and so on. Adhering to this model will allow for rapid development of new modules in future.

Events

The event system is simulated using the Redux middleware Thunk. The base dispatch sets the event to executing, and it continues to execute until it is told to stop. If error is true, the event stops executing and the error response is populated in the response key. If error is false, the event resolved correctly and the response is the delicious events or tasks. React also makes rendering the correct component a breeze: I know to hide all content if the user hasn't authorised, or while the event is executing. Error messages are nice and simple too. https://youtu.be/T43RzjxwBys

Next steps

Agander is being temporarily put on hold to focus on other projects, but in its current state it is very much usable. Aside from bug fixes, there will be no new features for at least a couple of months while I'm working on other things. I'm really happy with how far the app has come, and I can finally use it for my own agenda tracking.
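To make the event model above concrete, here's a minimal thunk-style sketch. All names are illustrative, not Agander's actual code: an event is dispatched as executing, then either resolves with data or stops with the error populated in the response key.

```javascript
// Illustrative thunk: the module "event" starts executing, then either
// resolves with the delicious events/tasks or records the error message
// in the response key.
function runModuleEvent(moduleId, fetcher) {
  return function (dispatch) {
    dispatch({ type: 'EVENT_EXECUTING', moduleId: moduleId });
    return fetcher()
      .then(function (data) {
        dispatch({ type: 'EVENT_RESOLVED', moduleId: moduleId, error: false, response: data });
      })
      .catch(function (err) {
        dispatch({ type: 'EVENT_ERROR', moduleId: moduleId, error: true, response: err.message });
      });
  };
}
```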
\n", "url": "https://boyleingpoint.com/blog/posts/agander-2-0-is-now-out", "title": "Agander 2.0 is now out", "date_modified": "2016-06-07T00:00:00.000Z", "author": { "name": "Luke Boyle", "url": "https://boyleingpoint.com" } }, { "id": "https://boyleingpoint.com/blog/posts/google-calendar-api-color-id", "content_html": "When you request a Google Calendar event it will come with a colorId which is either undefined if user didn't select a colour, or between one and 11 if they did. Since I needed these for Agander, I decided to collate these for the curious. These are the corresponding colours used in the Google Calendar app.
Color ID | Color Name | Hex Code
---|---|---
undefined | Who knows | #039be5
1 | Lavender | #7986cb
2 | Sage | #33b679
3 | Grape | #8e24aa
4 | Flamingo | #e67c73
5 | Banana | #f6c026
6 | Tangerine | #f5511d
7 | Peacock | #039be5
8 | Graphite | #616161
9 | Blueberry | #3f51b5
10 | Basil | #0b8043
11 | Tomato | #d60000
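Since an undefined colorId just means the user never picked a colour, a lookup with a fallback covers every row of the table above. This helper is my own, not part of the Calendar API:

```javascript
// Map built from the table above; an undefined colorId falls back to the
// default calendar blue.
var CALENDAR_COLORS = {
  '1': '#7986cb',  // Lavender
  '2': '#33b679',  // Sage
  '3': '#8e24aa',  // Grape
  '4': '#e67c73',  // Flamingo
  '5': '#f6c026',  // Banana
  '6': '#f5511d',  // Tangerine
  '7': '#039be5',  // Peacock
  '8': '#616161',  // Graphite
  '9': '#3f51b5',  // Blueberry
  '10': '#0b8043', // Basil
  '11': '#d60000'  // Tomato
};

function eventColor(colorId) {
  return CALENDAR_COLORS[colorId] || '#039be5';
}
```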
Agander started in November 2015 with a vision to unify several of the productivity services I use. With Agander I could now have one tab where previously I had four or five. This post is fairly overdue, but I think it's worth taking the time to appreciate how far the project has come. While I did start in November, the biggest progress didn't come until January 2016. Working a 9-5 job and then coming home to work on Agander until 1AM has been a struggle, but the outcome is the true reward.

Agander as of Version 0.1 in December (with vaporware calendar)

Agander as of Version 1.0 on March 19th

Agander has now entered a brief period of refactoring and optimisation, after which the next set of integrations will be developed to create a more comprehensive platform.
\n", "url": "https://boyleingpoint.com/blog/posts/agander-1-0-is-now-out", "title": "Agander 1.0 is now out", "date_modified": "2016-04-11T00:00:00.000Z", "author": { "name": "Luke Boyle", "url": "https://boyleingpoint.com" } }, { "id": "https://boyleingpoint.com/blog/posts/why-i-cancelled-my-spotify-subscription", "content_html": "I've been a Spotify Premium subscriber since 2013, and I've watched a gradual change from useful to straight-up garbage.

Adding a local file

It used to be that a paired device would appear in the sidebar and you could drag local files onto it. I'm not sure what the justification is, but paired devices no longer appear as accessible storage; as a compromise, you can now use your devices as remote controllers. The current process to get local files onto another device is: add the local file to Spotify in your preferences, go to Local Files in the sidebar, drag the files to a playlist, and then make the playlist available offline on your mobile device. What's wrong with this?
\nI switched to Apple Music (this is not an endorsement; there's plenty wrong with Apple Music too). The reason I chose Apple is because
\nAfter a long battle with the weird Google Task Javascript API I've established a module for Agander that has the ability to:
\nAuthorising the user and displaying their tasks is reasonably easy following the quickstart guide. Essentially, requests are separated into two categories: either tasks
or tasklists
. Once you have loaded the Tasks API, you can see the basic structure and work from there (see the API Reference for JS). To find the tasklists, you use the list function, which returns an array of tasklist objects.
function listTaskLists(gAPI) {\n var request = gAPI.client.tasks.tasklists.list({\n 'maxResults': 10\n });\n // Pass a callback to execute to actually receive the response\n request.execute(function (response) {\n console.log(response.items); // the array of tasklist objects\n });\n}
\nFinding tasks in a given task list works much the same way; however, you are dealing with Google here, so it's tasks.tasks.list. The only required parameter is the tasklist you want to pull tasks from, though there are other options.
\nfunction getTasksByListId(gAPI, tasklistId) {\n var request = gAPI.client.tasks.tasks.list({\n 'tasklist': tasklistId\n });\n request.execute(function (response) {\n console.log(response.items); // the tasks in this list\n });\n}
\nSo, we've covered getting the tasks; how do we manipulate them? That's where the tricky part comes in. The gapi
client interactions we used before have an update
method. However, whenever I called update on anything, I got a 400 error with 'Invalid Value'. This is a common issue I've observed online with no real solutions. The gist of it is that there are a bunch of 'required parameters' you must include in the request, but there is absolutely no documentation on them (thanks Google). To get around this, I found it was simply easier to make the request outright using the request method and giving it a URL. The path parameter requires a tasklist id and a task id; this is basically the URL that comes down with the getTasksByListId response. Make sure you define the method as PUT, and pass the whole task object with your updated values to Google. In this instance, we are marking the task as completed and giving it a completed timestamp.
function markTaskComplete(gAPI, tasklistId, task) {\n gAPI.client.request({\n path: 'https://www.googleapis.com/tasks/v1/lists/' + tasklistId + '/tasks/' + task.id,\n method: 'PUT',\n // Send the whole task back with the updated fields, otherwise\n // Google responds with the 'Invalid Value' 400\n body: Object.assign(\n {},\n task.originalTask,\n {\n completed: new Date().toISOString(),\n status: 'completed'\n }\n )\n }).execute();\n}
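The body merge is the part that dodges the 400, so it's worth pulling out on its own. This is a hypothetical helper of my own, not part of the gapi client:

```javascript
// Hypothetical helper: merge the original task with the fields Google
// expects for a completed task, leaving everything else untouched.
function buildCompletedBody(originalTask) {
  return Object.assign({}, originalTask, {
    completed: new Date().toISOString(),
    status: 'completed'
  });
}
```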
\nNow that you have a basis, the world is your oyster.
\n", "url": "https://boyleingpoint.com/blog/posts/google-task-javascript-api-invalid-value-400-error", "title": "Google Task Javascript API - Invalid Value 400 Error", "date_modified": "2016-03-19T00:00:00.000Z", "author": { "name": "Luke Boyle", "url": "https://boyleingpoint.com" } } ] }