
Perhaps nothing has defined higher education over the past two decades more than the rise of computer science and STEM. Since 2016, enrollment in undergraduate computer-science programs has increased nearly 49 percent. Meanwhile, humanities enrollments across the United States have withered at a clip—in some cases, shrinking entire departments to nonexistence.

But that was before the age of generative AI. ChatGPT and other chatbots can do more than compose full essays in an instant; they can also write lines of code in any number of programming languages. You can’t just type make me a video game into ChatGPT and get something that’s playable on the other end, but many programmers have now developed rudimentary smartphone apps coded by AI. In the ultimate irony, software engineers helped create AI, and now they are the American workers who think it will have the biggest impact on their livelihoods, according to a new survey from Pew Research Center. So much for learning to code.

Fiddling with the computer-science curriculum still might not be enough to maintain coding’s spot at the top of the higher-education hierarchy. “Prompt engineering,” which entails feeding phrases to large language models to make their responses more human-sounding, has already surfaced as a lucrative job option—and one perhaps better suited to English majors than computer-science grads.

The potential decline of “learn to code” doesn’t mean that the technologists are doomed to become the authors of their own obsolescence, nor that the English majors were right all along (I wish). Rather, the turmoil presented by AI could signal that exactly what students decide to major in is less important than an ability to think conceptually about the various problems that technology could help us solve.

  • psivchaz@reddthat.com
    1 year ago

I think this is generally true, probably for the rest of my career. I don’t think it is true forever. Asking “what happens when this stops being a career,” or at least “what happens when there are fewer jobs to go around,” is important, and something I would rather we all sort out long before I need the answer.

    • varsock@programming.dev
      1 year ago

Valid point. Again, for the sake of discussion: technology evolves quickly, and new tools are built from the shortcomings of others. If Docker evolves and a new tool, Kocker, is born, AI will need training data on its best practices, which will have to be generated by people.

This could unfold in many ways. For one, there could be a small group of people pushing technology forward. But people will still need to be around to create requirements, which takes experience.

More likely, the majority of engineers will move up to a higher level of abstraction, letting new tools handle the lower layers. Any innovation at the lower levels of abstraction will be done by a small group of people with niche skills (take CPU design, for example). This is the trend we have seen historically: assembly -> compilers -> low-level languages -> interpreted languages -> scaling bare-metal systems -> distributed systems -> virtual machines -> automation -> microservices, etc.
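That ladder of abstraction can be sketched with a toy example (hypothetical code, not from the thread): the same task written in an explicit, close-to-the-machine style versus leaning on a language builtin that hides the iteration entirely.

```python
# Illustrative sketch: the same task at two levels of abstraction.

def total_manual(values):
    # Lower level: an explicit accumulator and loop, mirroring how
    # assembly would step through memory one element at a time.
    acc = 0
    for v in values:
        acc += v
    return acc

def total_builtin(values):
    # Higher level: the builtin hides the loop; the programmer only
    # states intent, and the runtime handles the mechanics.
    return sum(values)

print(total_manual([1, 2, 3]))   # 6
print(total_builtin([1, 2, 3]))  # 6
```

Each rung of the ladder works the same way: the lower mechanism doesn't disappear, it just becomes the concern of a smaller group of specialists.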