Much of what we perceive about others in the workplace is their performatory character – what others are inviting you to believe about them; it’s an attempt to become the idealized version of ourselves by acting the part. For the self-taught professionals in tech, imposter’s syndrome is the dead body we keep dragging around with us while we attempt to establish ourselves and make advancements in the field. Some university graduates, too, have struggled with this decaying corpse that plagues the tech world. Left unchecked, it often leads to a devalued sense of self, depression, and even other mental health problems – even in those whose performatory character would otherwise make them appear well put together.

I got into professional tech work at the age of 16, some 32 years ago, at a small computer shop building PCs. Having never had the opportunities others had to go to college, I’ve had to grow and adapt my skillset over the span of my career. Imposter’s syndrome – and depression – has been with me for much of my adult life. Even with what continues to be an excellent career at Apple, I’ve struggled with self-worth. Work environments can be nurturing and stimulating, bringing out the best in you; they can also be demotivating and devaluing – imposter’s syndrome can follow you through both. I’ve figured a few things out about myself over the past 32 years that have helped me navigate some difficult environments. Nobody develops imposter’s syndrome overnight, and any chronic illness requires a long-term cure. There’s nothing anyone can tell you that will simply fix imposter’s syndrome; there are, however, incremental ways to slowly recover from it.
Oxford’s definition of imposter syndrome is “the persistent inability to believe that one’s success is deserved or has been legitimately achieved as a result of one’s own efforts or skills.” In tech, this usually means we feel stupid because we don’t have the understanding or mastery we think we should. It’s interesting, though – people often tend to feel like it’s because they’re not smart enough. We are definitely smart enough to do this job; the reason we lack understanding isn’t because we’re missing brain cells. One thing computer science is good at is abstraction, which allows us to work with and learn higher-level concepts without needing knowledge of the world beneath them. One might say it’s what makes computing so great. Imposter’s syndrome seems to prey on the benefits afforded to us by abstractions, introducing uncertainties about our abilities. But there is a way of thinking that allows these abstractions to exist – where X can remain unknown without bothering you – while simultaneously seeing a universe where X fits in.
If you look at a lot of the brightest minds in computer science, there’s a distinguishable acumen about them that goes beyond simply knowing the subject matter. They have a scientific mind: able not only to explain something, but to theorize and reason about it, and to analogize. These are the kinds of skills that make for not only a good scientist, but a good engineer. It’s these same qualities that seem most desirable when we measure ourselves up, and often what smart assholes do such a terrible job trying to mimic. But this acumen doesn’t come from reading source code, mentoring by coworkers, or reading The Imposter’s Handbook. These qualities come from a combination of foundational knowledge, methodical reasoning, and discipline – things a lot of self-taught people like me don’t initially get much exposure to. What I think a lot of people want is to feel legitimate: that their knowledge isn’t fake or piecemeal, and that they are armed with the discipline to reason, make advancements, and solve complex problems. So here’s the pat on the shoulder: you’re probably very good at the subject matter you’re trained in, and you are no doubt intelligent if you are working in tech. Here’s the hard part: the abstractions we work with in computing have allowed us to develop gaps, and those gaps make us feel really dumb sometimes. To treat your imposter’s syndrome, we’ve got to work at this.
I originally published this in 2012, after the Sandy Hook shooting, and I dust it off every time there’s another mass shooting in the news. This post has seen the top of my feed year after year, as politicians continue to offer nothing – nothing but thoughts and prayers. I am now giving up, as our leadership has proven too politically ambitious to ever do anything meaningful with their careers at the risk of losing power. This blog post will just live here from now on.
I’ve been a longtime responsible gun owner, by the old definition of what that used to mean. Like a majority of responsible gun owners, I’ve long wanted more controls on semi-automatic rifles – particularly assault rifles. There’s idiocy on both sides of this debate, and both have some questionable notions: the extreme left seems to have developed an irrational fear and hatred of all guns, and the extreme right ignorantly believes the only solution to guns is more guns. Consider this more sensible perspective from someone who spent over a decade shooting and working on guns, held NRA certifications to supervise ranges and carry concealed weapons, and up until some years ago – when I sold the rights to it – produced the #1 ballistics computer in the App Store.
While often obscure to most, there is – today – a system in place to perform intensive checks of individuals looking to own firearms categorized as highly lethal; the problem is that it isn’t being used to control most assault rifles. Introduced in the National Firearms Act, this system was applied to machine guns, short-barrel rifles, silencers, sawed-off shotguns, and other types of firearms that individuals can still legally own today – but under far more than the casual regulation applied to AR-15s and similar firearms. It could be changed to include semi-automatic rifles with the stroke of a pen. In my opinion, it should be, and in this post I’ll argue why I’d like the President and legislators to push for this.
The priest shall bring her and have her stand before the Lord. Then he shall take some holy water in a clay jar and put some dust from the tabernacle floor into the water. After the priest has had the woman stand before the Lord, he shall loosen her hair and place in her hands the reminder-offering, the grain offering for jealousy, while he himself holds the bitter water that brings a curse. Then the priest shall put the woman under oath and say to her, “If no other man has had sexual relations with you and you have not gone astray and become impure while married to your husband, may this bitter water that brings a curse not harm you. But if you have gone astray while married to your husband and you have made yourself impure by having sexual relations with a man other than your husband”— here the priest is to put the woman under this curse—“may the Lord cause you to become a curse[d] among your people when he makes your womb miscarry and your abdomen swell. May this water that brings a curse enter your body so that your abdomen swells or your womb miscarries. Then the woman is to say, “Amen. So be it.”
Documented use of an Abortifacient, Numbers 5:16-22
In May 2022, white evangelical Christians woke up to some rather unexpected news. A draft opinion had somehow leaked out of the Supreme Court, suggesting that Roe v. Wade would soon be overturned. Shortly after, it was. I single out white evangelicals here because, according to a recent Pew Research study, they are twice as likely as other Americans (including other Christians) to want to see abortion outlawed. It would be an error, though, to conclude that this means white evangelicals are the most pro-life. No no no, this is not the case at all. White evangelicals are no more pro-life than other religious groups, Christian or otherwise – they are, however, the most autocratic. Yet those who would use the Bible to institute government-sponsored morality seem to have forgotten where the bodies are buried: also in their Bible.
The concept of abortion is nothing new. The practice of inducing an abortion as punishment for unfaithful women was once conducted as part of priestly duties in pre-Christian Judaism. A woman suspected of adultery, yet maintaining her innocence, would be partially stripped, treated as an animal (right down to the presentation of an animal’s meal offering), and made to drink a type of holy water concoction; it was believed that an unfaithful woman would abort her lover’s fetus and die within as long as three years if she were guilty (Mishnah Sotah 3). Holy water has a long tradition of being used to cleanse and purify, and so the implication was that the illegitimate fetus was evil and therefore must be purged from the woman. Behind the scenes, this seemed to have more to do with the financial aspects of marriage contracts and intimidation than it did with holiness, and the practice was eventually ended prior to the destruction of the second temple. Today’s American evangelicals take the opposite viewpoint to their ancestors – namely, against all forms of abortion – yet still firmly hold onto the practice of controlling women in much the same way. And while many other Christians value life just as much as autocratic evangelicals, we differ greatly from them, particularly on a solution to the number of unwanted pregnancies in the country. The earliest Christians opposed abortion by adopting others’ discarded and unwanted live babies – a Roman practice known as “infant exposure” would leave abandoned babies in the trash or otherwise discarded after birth, left to die or to be raised as slaves and prostitutes. It was this practice that many early writers condemned as “the worst abomination of all” (Philo of Alexandria). They wrote about Roman abortion practices far less.
Yet while early Christians put their faith into action by sacrificially taking in these babies to save them from such a fate, today’s evangelicals largely believe opposing abortion through politics and legislation is the only solution. Most others believe it is an ineffective and dangerous solution – perhaps just as dangerous as the ancient practice that once induced abortions (or was at least perceived to; the practice’s effectiveness was highly questionable among rabbis).
Forced morality is likewise nothing new either. In the book of Chronicles, King Josiah breaks down the altars of false gods, tears down carved images, and rids Judah and Jerusalem of the ungodliness of the time. When his priest finds the Book of the Law, Josiah tears his robe and imposes moral rule according to the laws of the book. The chronicler Ezra writes, “Josiah removed all the detestable idols from all the territory belonging to the Israelites, and he had all who were present in Israel serve the Lord their God. As long as he lived, they did not fail to follow the Lord, the God of their ancestors.” An often overlooked detail in this story is that in spite of a society living under (and clearly practicing!) moral law, God tells Josiah that he will take his life early so that he will not see the disaster God plans to bring about. A useful object lesson can be found here: perceived morality counts for little when it is compelled. At the center of today’s controversy is not really Christian doctrine at all (there is no Christian doctrine concerning abortion), or even morality, but rather the same desire for power; today, that translates to the church’s desire for socio-economic power.
In the beginning wickedness did not exist. Nor indeed does it exist even now in those who are holy, nor does it in any way belong to their nature.
Athanasius, Against the Heathen
I’ve devoted much of the past 30 years as an evangelical Christian “layperson” to Christian studies to try and become an educated one. Greek, theology, the patristics, and Christian history should be in the wheelhouse of every Christian, yet many never study their own religion, and merely live confined to the prison of their own prejudice. It is, therefore, of little surprise that what Christianity has become in America is more or less a product of a news cycle, and less about a gospel of a meek savior. Evangelical Christianity in America broke in 2020, though perhaps some would say it’s been broken longer.
Ever since, the church has stopped being recognizable – even to many Christians – in her embrace of the racism, hostility, and misinformation that many Christian believers proliferate. It often failed to resemble a church at all, but rather a counterfeit designed to resemble Christianity in name only, almost certainly alien to what was truly being worshipped. The year 2020 brought out some of the worst in the mainstream evangelical church – relatives, friends, and people I’ve grown up with, who were once a much-needed example of Christianity to me, have severely disappointed me in how they’ve conducted themselves, causing me to question if they ever truly understood their own faith.
What more is there for their Expected One to do when he comes? To call the heathen? But they are called already. To put an end to prophet and king and vision? But this too has already happened. To expose the God-denyingness of idols? It is already exposed and condemned. Or to destroy death? It is already destroyed. What then has not come to pass that the Christ must do?
Athanasius, On the Incarnation
Christianity introduced me to a God who interacted with humanity to offer a life greater than myself. This made a lot of sense to seventeen-year-old me. It still does. Christianity in America comes with a lot of baggage, though. Along with the powerful message of the gospel come a lot of strange ideas about the creation and destruction of the world. Depictions of violent and terrifying last days are often portrayed in both Hollywood fiction and from the pulpits of American churches. I spent many of my younger years as a friend to a fireball end-times preacher, who sadly died of COVID recently. Having been immersed in a church community where end-times motifs were often present, it became apparent over time that evangelical Christianity seemed to have conflated faith with magic, losing touch with historical Christian beliefs. Modern interpretations of end-times prophecy have become increasingly embellished within many churches, incorporating new themes from current events into a sort of theological composite to explain present-day unrest. Such theories divorced the pattern of a historical Jesus, who advocated non-violence, from one now seemingly the perpetrator of pointless violence, judgment, and terrifying death. These beliefs have altered the entire worldview of the evangelical church to adopt a militant, warfare-influenced mindset.
The concept of a violent and militant Jesus probably had its origins in the medieval period. The idea was first codified at the Council of Nablus in 1120, where Canon 20 permitted a clergyman to take up arms in self-defense without bearing any guilt; this was during turbulent times when Christian pilgrims were often massacred by the hundreds along their journey, leaving rotting corpses along the road from Jaffa into the Holy Land. This one concession, intended as a temporary measure, seeded militant movements in Christianity, starting with the Papal legitimization of the Templar movement (“God’s Holy Knights”), extremist groups such as Alfonso I’s Brotherhood of Belchite and the Pastoureaux, and now reaching into modern-day militant Christian ideals. End-times theories today evolve within evangelical churches to reinterpret current events into an apocalyptic context. They attract fringe groups with similar mindsets, as they include the same elements – oracle-sourced apocalyptic theories that lead to violent, anti-establishment outcomes. At the very least, today’s evangelical end-times worldview gives cover to white supremacy, replacement theory, and anti-government extremism. Yet this is in conflict with the teachings of Christ and hundreds of years of church fathers on martyrdom, pacifism, and government non-involvement. The obvious contradiction of a Christianity asserting a struggle that is “not against flesh and blood” somehow ending up in a literal war against flesh and blood is the result of a theological evolution that influenced how the church interprets scripture and forms doctrine today. In many churches, to not believe in a brutal and imminent end times means you don’t have a Christian faith at all.
Theories about masks, vaccines, the World Health Organization, and a new president are popular topics of recent end-times discussion within churches. The idea that anyone can speculate on end-times prophecy has attracted conspiracy groups like QAnon, which now represents up to 25% of white American evangelicals. Denominationalism, while having some benefit, has also become a significant enabler of confirmation bias in the church, allowing tribal systems of otherwise fringe beliefs to find support. These beliefs have become more extreme as a result of the social dysfunction created by COVID and deep divisions in politics. Beliefs about masks, vaccines, and other current topics are now loosely joined to end-times concepts of one world government, the mark of the beast, eternal punishment, or other themes in Revelation. Conspiracy theories within the church’s walls have had very real consequences. A study from the Public Religion Research Institute (PRRI) showed that a mere 41% of white evangelicals believe scripture provides no reason to refuse the COVID vaccine – that’s 59% of white evangelicals who think otherwise. The same polling organization found that 18% of all Americans believe in the QAnon conspiracy that the “government, media and financial worlds in the U.S. are controlled by a group of Satan-worshipping pedophiles who run a global child sex-trafficking operation”. The most extreme example of end-times prophecy going off the rails was seen on January 6, when insurrectionists, driven by QAnon conspiracy theories, attempted a coup inside the Capitol. As one evangelical pastor put it, “Right now QAnon is still on the fringes of evangelicalism… but we have a pretty big fringe.”
The modern-day evangelical end-times posture can be traced back to a shift in theological interpretation in the mid-1800s. The interpretive biases that posit this theology have altered Christianity in many significant ways. Yet concepts of a sudden secret rapture, seven years of tribulation, and a thousand-year earthly kingdom all rest upon theological pillars of highly questionable origin. Such last-days concepts have no support in historic Christianity, and could be divorced from Christianity altogether. Many evangelicals, having been raised in this mindset, will deny vaccines and literally die on the basis of the theological system under which they were taught, firmly believing that they are honoring God in doing so. Yet it is a flawed and unfalsifiable system of theology – not Christianity itself – that is to blame. Let us attempt to tease those two concepts apart.
“For no property of God which the mind can grasp is more characteristic of Him than existence, since existence, in the absolute sense, cannot be predicated of that which shall come to an end, or of that which has had a beginning, and He who now joins continuity of being with the possession of perfect felicity could not in the past, nor can in the future, be non-existent; for whatsoever is Divine can neither be originated nor destroyed. Wherefore, since God’s eternity is inseparable from Himself, it was worthy of Him to reveal this one thing, that He is, as the assurance of His absolute eternity.”
St. Hilary of Poitiers, On the Trinity
I’ve often been asked why an intellectual type such as myself would believe in God – a figure most Americans equate to a good bedtime story, or a religious symbol for people who need that sort of thing. After about 30 years of life as a Christian, my faith in God is the only thing that’s peeled me off the pavement through many hard times in my life, and helped keep me grounded during COVID. What God has to say about me – as a human – having intrinsic value, and deserving love (even in times when I didn’t love myself), is likely the only reason I hadn’t pulled the trigger a few times in my life. But it is far from a crutch; it has pushed me to conquer my own selfishness as a human, to learn to forgive, to suffer myself to be defrauded for the sake of my testimony, and to serve something greater than myself. Striving to understand God, especially through all of the American nonsense that is in the church today, has been a thought-provoking and captivating journey as well. I wasn’t raised in a Christian home, nor did I have any real preconceived notions about concepts such as church or the Bible. I didn’t really understand Christianity at all through my youth, other than from the perspective of an outsider – all I had figured was that God was a religious symbol for religious people.
Today’s perception of Christianity in America is that of a hate-filled group of racists too stupid to take a vaccine – a title that many so-called Christians have rightfully earned for themselves. This doesn’t represent Christianity any more than the other extremes do, though, and even atheists know this. There is a real standard we are called to meet as Christians, and much of this country has fallen short. That doesn’t mean God isn’t who he said he is, and it doesn’t move the bar of accountability for those who profess to be Christian. There are countless people who don’t fit this stereotype, who strive to love and to do good, who won’t judge you, and who try their best to walk out a life worthy of the Christian faith.
I’ve been a Christian since 1993, and am convinced, based on my experiences and my understanding, that God is more than just a story. But it takes looking outside of the white American evangelical culture that’s often portrayed as Christianity to understand what God is about. I think most people already know in their heart who God is, and that’s why they’re so averse to the church. In recent times, there has been a cognitive dissonance between historical Christianity and the way the church behaves. Christians are equally mystified by this – but it does not invalidate everything that’s been written about God.
As the angst and stir-craziness set in from the world suddenly being forced into lockdown, I’ve seen a lot of articles about working from home by people in all walks of life, from programmers to astronauts. Most offer practical beginner advice: go outside, plan a schedule, and so on. That’s all good advice to take in, but after a few weeks, you’re probably realizing there’s a lot more to making this work well. As the reality of our predicament starts to sink in, it’s important to start thinking about the psychological demands of working from home. I’ve spent the better part of my 25-year career working from home, and when I started thinking about what wisdom, if any, I could share on how to make it work well, I found that I’d come up with a lot of the same things I’d already shared in a post two years ago, Living With Depression in Tech. Working at home has some fantastic benefits, but also challenges that go far beyond basic discipline development. Being productive and successful at home comes down to changing your perspective – focusing on the impacts you’re having, believing in what you’re doing, and finding ways to grow and thrive on your own so that you can maintain your drive over the long haul.
Joshua Harris, the author of “I Kissed Dating Goodbye”, recently renounced his faith and apologized for his awful book. I remember when it came out in the late ’90s, and I still see the lasting damage it inflicted on two generations of young men and women. Harris ended up creating a toxic culture inside the mainstream church, one that took Christian men back into the dark ages of devaluing women based on their level of sexual indiscretion, and helped fan the flames of homophobia and exclusion. His “sexual prosperity gospel”, as it’s been called, led to a life of guilt and shame for many, and created lasting scars that caused some to abandon their faith or their marriages later in life.
Christianity teaches that a person’s worth has nothing to do with their sexual history (or orientation), but from Jesus, who was willing to die to reconcile humanity to God. We’re not defined by our sins, and we’re not defined by our past; we are defined by Christ. This is a far cry from the cultish fundamentalist legalism that Harris’s church taught for decades; the purity movement amounted to nothing more than a way for Christians to measure themselves and others up. It’s no surprise that Harris renounced his faith; if the faith he was practicing was grounded in such a flawed understanding of grace and intrinsic human worth, then by any measurement it was not Christianity. The truly sad part is that he convinced millions of Christians to adopt this same world view for more than 20 years, allowing it to hurt a lot of people before it became popular for leaders to finally speak out against it. Sorry, Josh, but an apology doesn’t let you off the hook.
But this failure wasn’t just of Harris’s own making: it was the complete failure of church leaders everywhere in elevating Harris to the status of a Christian leader. Harris was a mere 21 years old, and hadn’t even been to seminary, when he wrote the book. Rather than rightfully dismissing his book as yet more of the trash writing of that era, the inexperienced youth leaders of the time (many of whom also lacked formal training) saw a way to get kids to act responsibly, without considering the consequences of his legalism. From piecing together accounts online, Harris’s own church reeked of deep-seated problems, including sexual abuse coverup, abuses of power, control and manipulation of the congregation, and rampant legalism. The church had become so damaging that much of his congregation ended up leaving, and there’s an entire blog dedicated to victims trying to recover from Harris and the rest of his church’s leaders. Indeed, it’s very telling to see the kind of culture his book came out of, and the horrifying fruits of it. When you read that Josh Harris has departed Christianity, this appears by all accounts to be a very good thing for Christianity.
I’ve been trying to avoid writing about depression for a while now. Almost nobody in tech wants to talk about things like this. A stigma still very much exists around mental illness, and in tech – with all its flaming, trolling, and fragile egos – people have learned to be thick-skinned. It’s taken me years to realize that I never stopped struggling with depression throughout my dysfunctional childhood, and that I’ve carried it through my teens and adult life with me. I was diagnosed and medicated as a teen, but didn’t fully understand that it still haunted me, playing the same old record grooves in my brain in adulthood. As my thyroid disease began accelerating, I needed to work even harder to maintain balance or the world would come crashing in. I struggled through my career and relationships, but things became easier after I understood what was going on inside of me. I feel a certain responsibility to bring to light what is likely a widespread issue in the tech community.
Depression can manifest itself in various forms for different people, and my story isn’t “everyone’s” story. I can only write from my own personal experiences. Much of my story involves lifelong personal struggles unrelated to work, and while one can probably deduce as much, the focus of this post is handling professional challenges. You might identify with some of these issues, and that’s great if this post helps, but it shouldn’t be used for self-diagnosis. My depression has been far worse than the details I’m willing to share publicly, and if you think you may be depressed, you should seek professional counseling.
I have no background in psychology; I’m just sharing what works for me. I have no background in medicine either, and having been on and off medication, I can’t recommend one way or the other. I do know that all medication has its limits, so learning how to cope is an important part to having a complete life plan. At the end of the day, I can’t solve your depression (or mine), but I can share how I’ve coped with it, and won some victories. This is a survival story that hopefully might have some meaningful advice for others.
The current young generation will soon have grown up without ever knowing what it’s like to live without social media, and without any sense of how society worked before it came into play. Whether you use social media or not, it’s likely affected your life, because it’s changed how people relate to one another – including you. While there are many good aspects to social media and the concept of bringing people together, it has also changed how we relate to one another in many negative ways.
I’ve spent a lot of time observing how social media has affected others online over time, and I’ve seen the problems it can create. Personally, I’ve never been happier to be off of social media than in the past year or so, since I finally ditched Twitter for good. Twitter is a creepy and toxic place, which seems to be exactly what its CEO wants it to be. I found that I didn’t like the person I had to become in order to stay on it. Most social media is a dumpster fire, but Twitter was a particularly awful experience. It simply isn’t worth the stress and distraction in order to relate to a bunch of randos on the Internet whose only goal in life is to cause misery. Social media doesn’t deserve to have the power to change you, but it does. Getting back to the humanity of relationships is almost like waking up from a bad dream: you’d almost forgotten the goodness of what normal relationships with others (professional, friendships, etc.) feel like.
So at the risk of the next generation never knowing what it’s like to have a normal relationship with others, I’ve written down just a few of the things that are important in building friendships and other types of relationships – things social media seems to have endangered… at least, from the perspective of this old Gen-X’er. Writing all of this makes me really miss how people were before social media existed.
I was just a teenager when I got involved in the open source community. I remember talking with an old bearded guy once about how this new organization, GNU, was going to change everything. Over the years, I mucked around with a number of different OSS tools and operating systems, got excited when symmetric multiprocessing came to BSD, screwed around with Linux boot and root disks, and became both engaged and enthralled with the new community that had developed around Unix. That same spirit was simultaneously shared outside of the Unix world, too. Apple user groups met frequently to share new programs we were working on with our ][c’s, and later our ][gs’s and Macs, exchange new shareware (which we actually paid for, because the authors deserved it), and to buy stacks of floppies of the latest fonts or system disks. We often demoed our new inventions, shared and exchanged the source code to our BBS systems, games, or anything else we were working on, and made the agendas of our user groups community efforts to teach and understand the awful protocols, APIs, and compilers we had at the time. This was my first experience with open source. Maybe it was not yours, although I hope yours was just as positive.
It wasn’t open source that people were excited about, and we didn’t really even call it open source at first. It was computer science in general. Computer science was a brand new world of discovery for many of us, and open source was merely the by-product of natural curiosity and the desire to share knowledge and collaborate. You could call it hacking, but at the time we didn’t know what the hell we were doing, or what to call it. The environment, at the time, was positive, open, and supportive; words that, unfortunately, you probably wouldn’t associate with open source today. You could split hairs and call this the “computing” or “hacking” community, but at the time all of these things were intertwined, and you couldn’t tease them apart without destroying them all. Perhaps that’s what went wrong: eventually, we did.
Back in the late 1970s, the University of California, Berkeley, released BSD Unix through CSRG, a research group inside Berkeley, and laid the foundation for many operating systems (including Mac OS X) as we know them today. The permissive BSD license that accompanied it promoted free software that could be reused by anyone. BSD gradually evolved over time to contribute the sockets API, TCP/IP networking, the fast file system, and a lot more. You’ll find traces of all of these principles – and very often, core code itself – still used decades later in cutting-edge operating systems. The idea of “free software” (whether “free as in beer” or “free as in freedom”) is credited as a driving force behind today’s technology, multi-billion-dollar companies, and even the iPhone or Android device sitting in your pocket. Here’s the rub: None of it was ever really free.
Today, I uninstalled Firefox from my computer. There was no fanfare, or large protest, or media coverage of the event. I’m sure many have sworn off Firefox lately, but unlike the rest of those who did, my reasons had nothing to do with whether I support or don’t support gay marriage, Proposition 8, or whatever. Nor did they have anything to do with my opinion on whether Brendan Eich was fit to be CEO, or whether I thought he was anti-gay. In fact, I would have uninstalled Firefox today regardless of what my position is on the gay marriage issue, or any other political issue for that matter. Instead, I uninstalled Firefox today for one simple reason: in the tendering of Eich’s resignation, Mozilla crossed over from a company that had previously taken a neutral, non-participatory approach to politics, to an organization that has demonstrated that it will now make vital business decisions based on the whim of popular opinion. By changing Mozilla’s direction to pander to the political and social pressure ignited by a small subset of activists, Mozilla has joined the ranks of many large organizations in adopting what once was, and should still be, considered taboo: a lack of corporate neutrality. It doesn’t matter what those positions are, or what the popular opinion is: Mozilla has violated its ethical responsibility to, as an organization, remain neutral on such topics. Unfortunately, this country is now owned by businesses that violate this same ethical responsibility.
Corporations have rapidly stepped up lobbying and funneling money into their favorite political vices over the past decade. This radicalization of corporate America climaxed in 2010, when what was left of the Tillman Act (a law passed in 1907 to restrict corporate campaign contributions) was essentially destroyed, leaving the corporate world virtually unrestricted in holding politicians in its back pocket through financial contributions. Shortly before, and since then, America has seen a massive spike in the amount of public, overt political lobbying – not by people, not by voters, but by faceless organizations (without voting rights). What used to be a filthy act often associated with companies like tobacco manufacturers has now become a standard mechanism for manipulating politics. Starbucks has recently, and very rudely, informed its customers that it doesn’t want their business if they don’t support gay marriage, or if they are gun owners – in other words, if you don’t agree with the values of the CEO, you aren’t welcome in its public business. This very day, 36 large corporations, including some that have no offices in Oregon, are rallying in support of gay marriage in Oregon. The CEO of Whole Foods has come out publicly in protest of the Affordable Care Act. Regardless of your views on any of these, there’s a bigger problem here: it has now become accepted that corporate America can tell you what to believe.
Many governments (including our own, here in the US) would have their citizens believe that privacy is a switch (that is, you either reasonably expect it, or you don’t). This has been demonstrated in many legal tests, and abused in many circumstances ranging from spying on electronic mail, to drones in our airspace monitoring the movements of private citizens. But privacy doesn’t work like a switch – at least it shouldn’t for a country that recognizes privacy as an inherent right. In fact, privacy, like other components of security, works in layers. While the legal system might have us believe that privacy is switched off the moment we step outside, the intent of our Constitution’s Fourth Amendment (and our basic right, with or without it hard-coded into the Constitution) suggests otherwise; in fact, the Fourth Amendment was designed in part to protect the citizen in public. If our society can be convinced that privacy is a switch, however, then a government can make the case for flipping that switch off in any circumstance it wants. Because no one can ever practice perfect security, it’s easier for a government to simply draw a line at our front door. The right to privacy in public is one that is being very quickly stripped from our society by politicians and lawyers. Our current legal process for dealing with privacy misses one core component which adds dimension to privacy, and that is scope. Scope of privacy is present in many forms of logic that we naturally express as humans. Everything from computer programs to our natural technique for conveying third grade secrets (by cupping our hands over our mouth) demonstrates that we have a natural expectation of scope in privacy.
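The analogy to computer programs can be made literal: in most languages, a name has lexical scope, visible only within the block that declares it, rather than being globally switched “on” or “off.” A minimal Python sketch (the names here are illustrative, not from any real system):

```python
def whisper():
    # 'secret' is defined in this function's local scope only;
    # using it here does not expose it to the rest of the program.
    secret = "meet at noon"
    return len(secret)

print(whisper())  # work done with the secret is visible outside...

try:
    secret            # ...but the secret itself is not
except NameError:
    print("'secret' is out of scope here")
```

Like cupping your hands over your mouth, scope limits *who* can see the secret, not *whether* it was shared at all – disclosure to one listener is not disclosure to the world.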
I don’t normally write about such personal topics as family illnesses, but it is my hope that those who have gone through a similarly dark corridor in their life – whether as a result of government control, or just plain ignorant doctors – will know that they are not alone in such frustrations, and that the generally oblivious public and incompetent lawmakers will understand the consequences of their actions.
“Don’t ask yourself what the world needs. Ask yourself what makes you come alive and then go do that. Because what the world needs is people who have come alive.” – Howard Thurman
A friend of mine was going on about really knowing people; “people… are not defined by what they do”, he said. The point he was making was not to judge people by the cover of what they do in life. But the deeper point, which he may not have realized, was the tragedy in the truth of that statement. How tragic it is that we aren’t defined by what we do. It seems to me that, given the finite amount of time we have to live and become, we spend more of our lives thinking about what we want to do than actually doing it.
I recently gave a talk at O’Reilly’s Ignite Boston party about the exciting iPhone forensics community emerging in law enforcement circles. With all of the excitement came shame, however; not for me, but for everyone in the audience who had bought an iPhone and put something otherwise embarrassing or private on it. Very few people, it seemed, were fully aware of just how much personal data the iPhone retains, in spite of the fact that Apple has known about it for quite some time. Despite the impressive quantities of beer that were drunk at Tommy Doyle’s, I was surprised to find that many people were sober enough to turn their epiphany about privacy into a discussion about full disclosure. This has been a hot topic in the iPhone development community lately, and I have spent much time pleading with the different camps to return to embracing the practice of full disclosure.
It looks like I missed the 1960s, but I’ve read that there were plenty of free drugs and free sex to go around. One thing that apparently wasn’t free, though, was telephone equipment. And behind all of the groovy things to do back then, the one thing nerds seemed to be into was having fun with the telephone networks. The digital telephone network was brand new, and so consumer ignorance was at an all-time high. This made for easy profiting – AT&T made a killing by charging its customers not only for telephone service, but also usage and equipment rental fees for telephones, answering machines, and anything else you wanted to plug into your phone jack.
Countless sermons have been preached instructing people to give, and God will let you have the car you want, the house you want, and the life you want. Amusingly, my web logs indicate that this essay is found frequently by pastors Googling for prosperity sermons to preach on Sunday. It seems strange, though, that a people who profess to follow Christ are so anxious to convince the church that God wants them to be rich, when the Bible teaches no such thing – God has promised us no such prosperity, but only trials, tribulation, and possibly martyrdom. James teaches us that there is something profoundly wrong with a miser, treating riches hoarded without generosity as a sign of poor character. So are pastors simply in error, wanting to see their congregations blessed in this consumer-driven American culture, or are they preaching up promises of breakthroughs and finances because they know they’ll reap some of the benefits? In either case, Christians shouldn’t be so naive, given the role models we have in Jesus and the apostles.
I’ve spent many late evenings over the past month translating and researching an intriguing early Christian manuscript called the Didache. Greek for “teaching,” this first-century manuscript reveals the life and heart of the early Church. It has been the center of much academic interest and controversy since its rediscovery in 1883. Prior to this, it was thought lost to history, although many early church fathers, including Athanasius, Rufinus, and John of Damascus, cited the book as inspired scripture. It was also accepted into the Apostolic Constitutions Canon 85 and the 81-book Ethiopic Canon. Other early witnesses, including Barnabas, Irenaeus, Clement of Alexandria, and Origen, either quote or reference the Didache.