Much of what we perceive about others in the workplace is their performatory character – what they are inviting us to believe about them; it’s an attempt to become the idealized version of ourselves by acting the part. Most of us are quite competent in the field, but even so, everyone gets imposter syndrome from time to time. For the self-taught professionals in tech, it can be the dead body we keep dragging around even while making advancements in the field. Some university graduates, too, have struggled with this decaying corpse that plagues the tech world. Left unchecked, it often leads to a devalued sense of self, depression, and other mental health problems – even in those whose performatory character would otherwise make them appear well put together.

I got into professional tech work at the age of 16, some 32 years ago, at a small computer shop building PCs. Having never had the opportunities others had to go to college, I’ve had to grow and adapt my skillset over the span of my career. Imposter syndrome – and depression – have been with me for much of my adult life. Even with what continues to be an excellent career at Apple, I’ve struggled with self-worth. Work environments can be nurturing and stimulating and bring out the best in you; they can also be demotivating and devaluing – imposter syndrome can follow you through both.

I’ve figured a few things out about myself over the past 32 years that have helped me navigate some difficult environments. Nobody develops imposter syndrome overnight, and any sickness that is chronic requires a long-term cure. There’s nothing anyone can tell you that will simply fix imposter syndrome; there are, however, incremental ways to slowly recover from it.
Oxford defines imposter syndrome as “the persistent inability to believe that one’s success is deserved or has been legitimately achieved as a result of one’s own efforts or skills.” In tech, this usually means we feel stupid because we don’t have the understanding or mastery we think we should. It’s interesting, though: people often feel it’s because they’re not smart enough. We are definitely smart enough to do this job; the reason we lack understanding isn’t that we’re missing brain cells. One thing computer science is very good at is abstraction, which allows us to work with and learn higher-level concepts without needing knowledge of the world beneath them. One might say it’s what makes computing so great. Imposter syndrome seems to prey on the benefits afforded by abstractions to introduce uncertainties about our abilities. But there is a way of thinking that allows these abstractions to exist – where X can remain unknown and it won’t bother you – while simultaneously seeing the universe where X fits in.
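To make the abstraction point concrete, here’s a minimal sketch of my own (not from the original essay): Python’s dict is a hash table, and you can use it productively and correctly with zero knowledge of hashing, collision resolution, or resizing. That hidden machinery is exactly the kind of “X” that can remain unknown without undermining your work.

```python
# A dict is a hash table under the hood, but the abstraction hides all of
# that: no knowledge of buckets, hash functions, or resizing is required.
inventory = {}
inventory["widgets"] = 42   # insert: fast on average, however it's implemented
inventory["gadgets"] = 7

# Lookups work whether or not you could explain open addressing vs. chaining.
print(inventory.get("widgets"))   # prints 42
print(inventory.get("missing"))   # prints None -- safe default, no exception
```

You’ve been trusting abstractions like this all along; imposter syndrome just reframes that trust as ignorance.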
If you look at many of the brightest minds in computer science, there’s a distinguishable acumen about them that goes beyond simply knowing the subject matter. They have a scientific mind: they can not only explain something, but theorize and reason about it, and analogize from it. These are the kinds of skills that make for not only a good scientist, but a good engineer. It’s these same qualities that seem most desirable when we measure ourselves up, and often what smart assholes do such a terrible job trying to mimic. But this acumen doesn’t come from reading source code, being mentored by coworkers, or reading The Imposter’s Handbook. These qualities come from a combination of foundational knowledge, methodical reasoning, and discipline – things a lot of self-taught people like me don’t initially get much exposure to. What I think a lot of people want is to feel legitimate: that their knowledge isn’t fake or piecemeal, and that they are armed with the discipline to reason, make advancements, and solve complex problems. So here’s the pat on the shoulder: you’re probably very good at the subject matter you’re trained in, and you are no doubt intelligent if you’re working in tech. Here’s the hard part: the abstractions we work with in computing have allowed us to develop gaps, and those gaps make us feel really dumb sometimes. To treat your imposter syndrome, we’ve got to work at this.
First, what foundational knowledge isn’t: it isn’t the job-related knowledge one gets from collaborating with coworkers. It also isn’t the low-level knowledge about the specific project you’re working on, which you may get by reading documentation or source code. Foundational knowledge is the underlying understanding of computer science and mathematics that enables you to conceptualize the primitives, rules, and constraints your computing world lives in. With those skills, you can dig into any project and grasp not only how it works, but its overall place in computational theory. This gives you something far more valuable than knowledge – understanding – and it’s one of the gaps a lot of self-taught engineers have. We learn plenty about computer science on the job, but this doesn’t boost our confidence in our own efforts or skillset; it’s too easy to credit all of it to whoever gave us that knowledge. Knowledge also isn’t the same as understanding, which is why merely knowing things doesn’t carry much currency, even with ourselves. When there are gaps in our knowledge, we can only apply what we know locally, to the micro-world we’re working in; turning it into something that can be applied universally across computer science requires a foundation.
Methodical reasoning (or logical reasoning) is a thought process that solves problems by working through a set of rules, theories, and relations. If you jumped straight into computers at an early age like I did (8!), then you probably know more about computer logic than the mathematics behind it. Many of the logical concepts in computing have primitives that live in the deductive reasoning of mathematics, so giving yourself a strong mathematical background is an important step toward developing this skill. I did not think this was important for a long time. Today’s mathematics has connections running all the way back to Aristotle’s syllogisms. Math is not just a language of numbers, as we’re taught to think of it in school; it’s a platform for visualizing, understanding, and solving logical arguments and building relations, and it represents logic better than many computer languages do.
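As one small illustration of that bridge (my example, not the essay’s): De Morgan’s law – that ¬(A ∧ B) is equivalent to (¬A) ∨ (¬B) – is a piece of classical deductive logic you can check exhaustively in a few lines, and it’s the very same rule you reach for when simplifying a messy conditional in code.

```python
from itertools import product

# De Morgan's law: not (A and B)  <=>  (not A) or (not B).
# This is the rule that lets you rewrite
#     if not (x > 0 and y > 0): ...
# as
#     if x <= 0 or y <= 0: ...
# Exhaustively verify it over every truth assignment of A and B.
for a, b in product([True, False], repeat=2):
    assert (not (a and b)) == ((not a) or (not b))

print("De Morgan's law holds for all truth assignments")
```

Because the domain is finite (four truth assignments), exhaustive checking here is a genuine proof – a tiny taste of how mathematical reasoning and programming reinforce each other.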
Lastly, discipline involves applying logical reasoning to knowledge in practice: understanding the rules and constraints of a system, and learning to methodically search for and develop solutions that fit cleanly into the patterns of best practice for the problem space. Practicing this fortifies your reasoning, and trains you to apply it to any computational problem.
The way to dig yourself out of imposter syndrome is first to fill the foundational gap with knowledge, logical reasoning, and discipline, so that we can feel more confident stepping up to describe and solve problems, theorize, and advance the state of the art. Once we have a firmer foundation for logical and mathematical reasoning, we can place the remaining unknowns – the Xs – in their proper place in the universe of computing. Knowledge alone doesn’t develop the qualities of the scientists we aspire to be. This is where many of us hit a dead end: not only is knowledge by itself mostly inert, it also has a shelf life. If that weren’t true, there wouldn’t be ten editions of Operating System Concepts, or five editions of Computer Organization, Design, and Architecture. Simply listening to college lectures on YouTube or taking online courses doesn’t usually develop the reasoning and discipline that make that knowledge useful, either; and if we’re only receiving that knowledge verbally, without internalizing it, we’ll retain very little of it and apply even less.
Everyone’s background is different, and I don’t pretend to know what curricula any one person needs to fill the learning gap, but if you’re self-taught, the one thing you probably haven’t done enough of is classwork. College textbooks hold a wealth of foundational knowledge, sure, but they are also designed to develop reasoning by introducing you to theory. Theorems aren’t just for proofs; they’re part of the foundation that makes reasoning possible. What is reasoning, except an argument with oneself while exploring a set of rules and constraints? Theorems are how you learn to search for solutions. You’ll internalize the material better by taking extensive notes rather than simply highlighting, and textbooks help develop discipline when you force yourself to do all of the exercises. It’s worth budgeting an hour or two a day for personal collegiate-level study.
Some foundational texts I would recommend for self-taught engineers include Discrete Mathematics (Lipschutz), Linear Algebra Done Right (Axler), Computer Organization, Design, and Architecture (Shiva), Introduction to the Theory of Computation (Sipser), Introduction to Algorithms (Cormen), Operating System Concepts (Silberschatz), and Compilers: Principles, Techniques, and Tools (Aho). Depending on your area of interest, you may also really enjoy Applied Cryptography (Schneier), Artificial Intelligence: A Modern Approach (Russell/Norvig), and the ARM System Developer’s Guide (Sloss). Pick one mathematics book and one computer science book to work through together.
I’ve also found an interesting dovetail between computer science and philosophy that has yielded some fascinating benefits, and would recommend reading the works of Hans-Georg Gadamer and Martin Heidegger – or books about their philosophy, which may be easier to take in on a first pass. Philosophy is particularly prominent in the world of machine learning, but it also overlaps with many other areas of computing, such as confirmation theory, social networks, and adversarial research. Heidegger’s work on truth and existential theory aligns closely with the development of the scientific method.
Chances are, you know a lot more than you think you do, but lack of confidence can be crippling. It is acceptable to admit both what you know and what you don’t. Learning and growing is one way to build that confidence, and to self-improve. Oftentimes, though, we can only grow forward when we’re willing to grow backwards – into the foundations that established the knowledge we already have. The long cure for imposter syndrome seems to be in developing the qualities that come with core knowledge, reasoning, and discipline.
There are two other characteristics of a growing individual that should be byproducts of a learning process. First, strive to make advancements in the field; be your own problem generator by identifying what’s broken in computing and what could be improved. Theorize. Start to devise solutions, even if you’re not far enough into your learning to fully know how to solve them. Keep a notebook of computational problems you’d like to solve, or other advancements you’d like to make; as you expand your learning, revisiting these should be inspiring. Second, one of the best ways to feel confident in your own abilities is to use 100% of what you do know to bring others up with you. Even if you only know 1% of a subject, teaching someone 100% of that 1% can bring them to a new level. There are many ways to do this, from collaborative study to mentoring to tutoring. I bet there are people in your field who would love to learn what you know about a particular subject. Taking the time to actually sit down and teach or mentor them will not only mean the world to them, it will help you see the value in what you have to offer. Studying a subject together can also benefit your own learning.
Abstractions are great for computing. They’re also terrible for self-esteem. You can’t plug every hole. Thinking with a scientist’s mindset lets you manage those abstractions so that you can confidently apply the rules of computing around them. Having the foundation and discipline to apply logical reasoning to the unknowns is how an AI agent builds its own micro-world when it cannot perceive the entire environment. We can do the same: acquire a firm foundation in theory and mathematics, and develop the same sharp acumen in this field that we perceive in others.