ThomW 18 hours ago

I feel super fortunate to be a part of that generation where screwing around at home could lead directly to employment. I taught myself Atari BASIC on my 800 and took a 286 IBM compatible to college where I was a music major. I dropped out and landed working for an industrial automation company because I knew how to write dumb little programs. A couple years later I was the sole guy programming robots for them in a structured BASIC language.

  • acka 3 hours ago

    Whenever I read replies like these, I feel jealous of people who dropped out of college yet still managed to land a job in tech.

    In my country, the Netherlands, it was almost impossible in the late 1980s to land a tech job other than a low-level service technician (read: roadie or cable guy) if you did not have at least a bachelor's degree or higher in a tech subject or a degree from a technical community college. College dropouts were turned away before even getting an interview, and bankers would rather commit suicide than finance tech startups founded by anyone without an appropriate college degree.

    Times sure have changed.

    • bluedino 2 hours ago

      It still works both ways. I work for a very large company with no degree, doing HPC/AI.

      I used to work for another very large company doing the same thing, but as a contractor. An FTE position opened up on our team, but HR told me I wasn't qualified for the role (even though I had been doing it for a few years on the same team...) because I didn't have a degree (which wasn't a requirement for a contractor).

  • rmbyrro 6 hours ago

    Pretty much my personal experience in a newer generation, just without the Atari, IBM, and BASIC.

    A lot of employers actually prefer engineers who come from a personal hacking background over the traditional paths, because we're truly passionate and care deeply. We're not in it for 8-to-5 and a paycheck.

  • chrismcb 17 hours ago

    So... the current generation? Between mobile devices, Raspberry Pis, web pages, Linux, and even Windows, there is plenty of stuff you can do just futzing around in your basement. Yeah, it might be impossible to create your own AAA game, but you can still create your own software. There are plenty of open source opportunities out there as well.

    • raincole 16 hours ago

      I suppose the parent comment was referring to the job market, not technology accessibility.

      • lukan 4 hours ago

        I guess the equivalent would be people getting a job via their github profile?

      • thrw42A8N 8 hours ago

        Don't ask for a million dollars per year and you'll have plenty of opportunities. There are tens of thousands of unfilled software jobs for higher than average wages.

        • iteria 7 hours ago

          But are they willing to even talk to someone who doesn't have a degree or experience? I've never worked at jobs that were super high paying, and I've never seen a fresh self-taught person on a job in the last 5 years. And I've done consulting and gotten exposure to a lot of different companies. I've also done scrappy startups, and boring small companies no one has ever heard of.

          Running into a self-taught person at all was rare, and when I did, their story almost always involved transferring from another career and leveraging some SME knowledge to get started. They already had training or a degree, just not in this.

          I'm not sure screwing around at home will actually land you a job. Not anymore.

          • maccard 7 hours ago

            Yes.

            There are definitely places that won’t talk to you without a degree, but many, many places will take a degree or equivalent.

            > screwing around at home will actually land you a job. Not anymore

            I don’t think “screwing around” will land you a job, whether it’s at home or at college/uni. But a degree tells me that you can stick with something for longer than a few months, of your own volition, even when you don’t always feel like it.

            Someone who has spent a year on and off learning to code hasn’t shown they can code or that they have any consistency, and both are equally important in a workplace. Someone with a degree in marine biology and a handful of GitHub projects who can pass a programming test? They’re probably my first choice. Someone with 3 years’ experience of writing code on their own? Absolutely. Show me those candidates and I’ll interview every one of them for a junior role.

        • jonfw 3 hours ago

          I was a self-taught programmer who at one point dropped out of college to try to get into the industry earlier. I spent about a year sending out applications and got absolutely zero response.

          I went back to school for the remaining 2 years, and when I graduated I had 5 competing offers, with salaries starting at double what I would have accepted before I finished school. As far as I can tell, this huge reversal in outcomes was purely the college degree: I had less time to send out applications, no internships, and no new personal projects of any substance.

          My experience is that there are too many college grads and boot campers with GitHub profiles for basic home tinkering to get you into the industry.

          If you're going to do it, I imagine you've got to go one step up and stand out.

    • dyauspitr 10 hours ago

      You’re going to need a very impressive portfolio of personal projects to get a job without a degree or experience today.

    • sandworm101 14 hours ago

      >> might be impossible to create your own AAA game

      Like Minecraft? Factorio? Modern tools allow a very small team to quickly produce very AAA games. Eye candy is still an issue, but AI is quickly creeping into that space. I would not be surprised if within the next decade we have the tools for a single person to generate what we would today call a AAA game.

      • lawik 10 hours ago

        "Very AAA" games and Minecraft/Factorio are not related.

        Minecraft and Factorio are both simpler productions in terms of visual fidelity; they lean on gameplay that is captivating. AAA is not a label for the quality of a game; it's more of a style/level of execution.

        Both Minecraft and Factorio started indie, to my knowledge, which is a separate path and approach from AAA games. Unrelated to good/bad.

      • eterm 3 hours ago

        Neither are AAA.

        Also, Factorio was crowdfunded via a kickstarter-like platform.

        Also both are around 15 years old. They are both closer in age to 1995 than today.

  • ChrisMarshallNY 15 hours ago

    Same here. My first programming job was a "crossover" from a hardware technician job. It both got me into software and introduced me to the title of "Engineer." (I was originally a Technician, then an Electrical Engineer, even though I mostly did software; in those days, I also designed the hardware the software ran on.)

    I got my first Apple programming job, because I had a Mac Plus at home, and learned to program it in ASM and Pascal.

    I've only taken some non-matriculated math courses. All the rest was pretty much OJT and home study (and a lot of seminars and short classes). My original education was High School Dropout/GED.

  • dghlsakjg 17 hours ago

    I’m not entirely sure we’re past those days.

    Up until the current hiring lull, it was very possible to get a programming position with just a self-taught background.

    When the need for juniors comes back around, I’m sure we’ll start to see it again.

    • hn_throwaway_99 10 hours ago

      > When the need for juniors comes back around, I’m sure we’ll start to see it again.

      Man, I'm skeptical, at least in the US. Since the pandemic, I've seen an absolute explosion in offshoring, which makes perfect sense when so many people are working remotely anyway. I've worked with lots of excellent engineers from Argentina to Poland and many places in between. It's tough for me to see how an American "tinkerer" will be able to find a job in that world if he wants an American-level salary.

      Also, I know the adage about "this time it's different" being the most dangerous phrase in the language, but, in at least one respect, something really is different. In the early 00s, after the dot-com bust, there was a ton of fear about outsourcing the bulk of software work to India. That turned out not to happen, of course, because (a) remote meeting software was nowhere close to where it is today, (b) remote work in general wasn't common, and (c) the timezone issues between the US and India were an absolute productivity killer. Today, though, everyone is used to remote work, and US companies have realized there are enough lower-cost locales with plenty of timezone overlap to make offshoring the norm.

    • musicale 16 hours ago

      I hope this is still true. There are certainly lots of opportunities for self-taught software and hardware development. And university lectures and course material (much of which is very good) that used to be locked inside physical campuses with expensive tuition fees are often freely available to anyone on the internet.

      You can definitely build a nice portfolio of open source software (and even hardware) on github. I would hope that is enough to get a job, but it might not be, especially in the current era of AI-fueled employment pressure.

    • dyauspitr 9 hours ago

      Juniors aren’t coming back, not with all this AI around.

  • xattt 14 hours ago

    I’m tech-literate in a fairly tech-illiterate field. My co-workers think I’m some sort of wizard when I show them basic Excel skills.

    Still waiting for my breakthrough.

    • qup 3 hours ago

      I was a witness in court last week and it was said that I was a computer whiz for knowing how to play the mp4 file on the thumb drive.

      And then later that was used against me to accuse me of lying about not knowing how to check the voicemail on my landline.

  • globalnode 17 hours ago

    the keys to employability have been captured by salesmen

dr_kiszonka 19 hours ago

The author is on HN and makes awesome art at https://andrewwulf.com

  • scrapcode 19 hours ago

    > "Today I make generative art..."

    Is generative art just AI, or is there something else out there that was called that before the emergence of AI? Genuinely curious.

    • swiftcoder 6 hours ago

      It's all hand-coded. Most folks in the generative art community are pretty upset about "generative AI" appropriating the name.

    • indigoabstract 6 hours ago

      It's made by people, but using code instead of brushes. Andrew Wulf's art is really beautiful, both in colors and patterns.

    • erichocean 19 hours ago

      Generative art pre-AI was art created with code.
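
      To make "art created with code" concrete, here is a minimal generic sketch (not the author's actual method, and it assumes nothing beyond the Python standard library): a script that deterministically draws a spiral of colored circles and returns it as an SVG string.

```python
import math
import random

def generative_svg(seed=42, n=60, size=400):
    """Render a deterministic abstract composition as an SVG string."""
    rng = random.Random(seed)  # seeded PRNG: same seed -> same image
    parts = [f'<svg xmlns="http://www.w3.org/2000/svg" width="{size}" height="{size}">']
    for i in range(n):
        t = i / n * 6 * math.pi          # angle along a three-turn spiral
        r = t * size / (12 * math.pi)    # spiral radius grows with the angle
        x = size / 2 + r * math.cos(t)
        y = size / 2 + r * math.sin(t)
        radius = 4 + rng.random() * 10   # jittered circle size
        hue = int(360 * i / n)           # hue sweeps the color wheel
        parts.append(
            f'<circle cx="{x:.1f}" cy="{y:.1f}" r="{radius:.1f}" '
            f'fill="hsl({hue},70%,55%)" fill-opacity="0.6"/>'
        )
    parts.append('</svg>')
    return '\n'.join(parts)

if __name__ == '__main__':
    # Write the image to disk; open it in any browser to view.
    with open('spiral.svg', 'w') as f:
        f.write(generative_svg())
```

      Re-running with the same seed reproduces the same image exactly; changing the seed or the spiral constants yields a new composition, which is the basic working loop of code-driven art.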

      • vajrabum 17 hours ago

        And he says on his about page "This art is primarily non-objective and abstract, focusing on complex shapes and colors. I use my math, programming, and digital manipulation knowledge to produce highly unique art." It's not AI generated.

fcatalan 9 hours ago

I was studying Physics, not out of particular interest, just because it was challenging, so I was doing badly.

I then discovered a small room that had two unsupervised computers hooked up to some mysterious world-spanning network, made friends there, and ended up leaving Physics for Computer Science.

My first job and every job in my 20s came from people I met in that room getting jobs themselves and calling me to see if I would go work with them, or someone from the previous jobs calling me back. I've never done a real job interview or sent a CV.

But then I formed a family and my social life plummeted. I'm also bad at really nurturing relationships that don't self sustain, so in retrospect I can see how my career ossified since then.

I don't totally regret it because even if I'm now underpaid and underemployed, I earn more than enough for my lifestyle and have loads of free time, so it balances the pang for greater things.

But yeah, networking is very important.

ian-g 16 hours ago

> Today I make generative art, see it on my website

I do love the ways random events can change folks’ lives. Would the author have ended up doing art at all without this happening?

otteromkram an hour ago

> So when they asked who could write 6502 assembly on an Apple II, I raised my hand figuring everyone here was a programmer — and found only my hand had been raised!

Pure comedy. I can just imagine the initial indifference as he raises his hand, only to look around and start lowering it slowly when one of the bosses looks at him and says, "You. No, not him. YOU. You stay, everyone else can leave."

yapyap 18 hours ago

Yeah, networking can give you the world.

Often networking is seen as this robot-like, bleep-bloop, hello-here’s-my-business-card thing, and at dedicated events it very well can be. But networking in the most basic sense is just making friends and shooting the shit; the only difference is that you can leverage those friends for opportunities in the workplace and vice versa.

  • glitchc 18 hours ago

    If there's mutual interest, certainly, but in most cases networking feels shallow and forced. If the only thing in common between us is the weather, I tune out quickly. Networking is mainly for those who truly like people.

cranberryturkey 2 hours ago

I was late to programming. I had a computer in junior high but didn’t start programming until the mid 90s when I got internet access at college.

oceanparkway 2 days ago

Metal desks!

  • sjf 19 hours ago

    My forearms are getting cold just thinking about it.

commandersaki 17 hours ago

Eh, I read the article and I still don't know what it means to "talk over a wall".

  • vajrabum 17 hours ago

    If you've worked in a cubicle farm you'd know. The cubicles were generally divided by low portable walls. There were different setups, but generally you don't see people when you're seated; if you stand, you can see your neighbors.

  • dghlsakjg 17 hours ago

    I think he means literally talking to someone on the other side of his cubicle wall.

  • GrumpyNl 8 hours ago

    For me it boils down to: communication is key. Talk to each other, exchange ideas.

  • tantalor 17 hours ago

    I think it means talking to people outside your team about your personal interest areas.

wlindley 19 hours ago

It's a program. "App" is a word, short for "Application Program," publicized by Apple for its handheld computers that masquerade as (and are euphemistically called) "telephones." "App" effectively means "proprietary closed-source program that talks to proprietary walled-garden programs running on someone else's computer, and acts as a spy sending all your sensitive data to who-knows-where."

No-one ever called a real program an "app" before that, did they?

  • happytoexplain 18 hours ago

    "Application" has been a common general term for an end-user program for a very long time, and "app" is just an obvious abbreviation that people and UIs have used to varying degrees all along. iOS apps merely mainstreamed the term, they didn't take ownership of it.

    • TeMPOraL 8 hours ago

      iOS mainstreamed it, but for a long time "app" had a different meaning. "Application" was the big full-featured thing you ran on your PC; "app" was the toy thing you ran on the phone.

      Then some "genius" started calling their desktop offerings "apps" (perhaps because lazy multiplatform-via-webapp development eventually extended to marketing copy), and now everything is an "app".

  • Smeevy 19 hours ago

    I've been programming professionally for over 30 years and "app", "application", and "program" have been interchangeable for me and the people I worked with as far back as I can remember.

    • peterfirefly 17 hours ago

      Operating systems are not apps. Embedded controller programs are not apps.

  • 0xcde4c3db 18 hours ago

    I don't recall seeing "app" on its own that often, but there was the idiom "killer app", meaning an application that was compelling enough to drive sales of its host platform (VisiCalc on Apple II being the go-to example).

  • tom_ 19 hours ago

    GEM on the Atari ST supported the .app (short for "application") extension for GUI executables. One of its components was the AES, short for Application Environment Services. This stuff dates from the early to mid 1980s.

  • cannam 19 hours ago

    > No-one ever called a real program an "app" before that, did they?

    Yes. Apple called them apps in the 80s, at least on the Mac. The article is about the Apple II, but it's plausible they were referred to as apps there too.

    For my part I read the title as "Taking over a wall changed my direction as a programmer" which had me really confused for a while. I'd like to read that article, I think.

    • musicale 17 hours ago

      Apple (App-le?) certainly popularized abbreviating "applications programs" or "application software" (vs. system software, systems programs, etc.) to "applications" in the 1980s, and to "apps" with the advent of the App Store in 2008. But Apple was unsuccessful in trying to obtain and enforce an App Store trademark, given prior uses of "app," "store," and "app store" (including, perhaps ironically given Steve Jobs' return and Apple's acquisition of NeXT, a store for NeXTSTEP apps). "Killer app(lication)" dates to the 1980s, applying to software like VisiCalc for the Apple II.

  • majormajor 19 hours ago

    "Applications" was a very common term in the classic Mac days. "Programs" was a more Windows-y term. ("Applications" vs "Program Files" in ye olden 90s world of where to put things you installed.)

    • koolba 18 hours ago

      IIRC, even the default template on Windows in the early 90s with Visual Studio was MFCApp.

  • TeMPOraL 8 hours ago

    In my experience, end-user programs you'd run and operate were called "applications" or "programs," and even that was a specialist term, because the general population didn't think in terms of applications at all: they thought in terms of "running Word" or "running Google," with the machine itself an implementation detail.

    As I remember it, the term "app" came from smartphones, where it referred specifically to smartphone applications. The connotations were rather negative: an inferior facsimile of the real thing (not even a full application, just an "app"); the term also came from "app marketplace"[0][1]. And to this day, apps are inferior to their desktop counterparts (at least the surviving ones), except marketing won this one: people got used to the term, then some vendors started calling desktop software "apps," and now suddenly everything is an app.

    --

    [0] - To this day I roll my eyes at the mental model here. It feels unnatural and wrong, but maybe that's because I'm not used to thinking in terms of finding markets and money-making opportunities everywhere.

    [1] - Or maybe it was just a huge letdown to me, and I've been soured ever since. Back when first the iPhone and then Android came out, I was hoping that a full-blown computer with Linux on board would mean it'd be much easier to just write your own stuff for it, and that it would feel more like a desktop in terms of functionality, capabilities, and opportunities. I did not expect them to come up with "app stores" and lock the thing down, making you accept the mark of the beast and entangle yourself with the commercial universe just to be able to add your own joke app or such.

    Since then, it only got worse.