I think software engineering will spawn a new subdiscipline, specializing in applications of AI and wielding the emerging stack effectively, just as “site reliability engineer”, “devops engineer”, “data engineer” and “analytics engineer” emerged.

The emerging (and least cringe) version of this role seems to be: AI Engineer.

@AutoTLDR

  • AutoTLDR@programming.dev (bot) · 1 year ago

    TL;DR: (AI-generated 🤖)

    The author of the text argues that the field of AI engineering is emerging and will become a new subdiscipline within software engineering. They propose that an AI engineering curriculum should focus on foundational concepts such as large language models (LLMs), embeddings, RLHF (reinforcement learning from human feedback), and prompt engineering. They also suggest exploring specific models like GPT-4, Claude, Bard, and LLaMa, frameworks like LangChain and Guidance, and tools like LlamaIndex and Pinecone/Weaviate. The author proposes several AI engineering projects, including building a document chatbot, a ChatGPT plugin, a basic agent, a smart assistant, and fine-tuning a language model. They emphasize the importance of building on existing models rather than training new ones, and recommend using closed-source products first and open source only as necessary. The author also encourages staying nimble and agile when working with evolving AI technologies. They seek feedback on their ideas and ask whether this concept could be turned into an actual course.
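    One of the proposed projects, the document chatbot, can be illustrated with a small retrieval-plus-LLM sketch. The example below is an editorial illustration rather than code from the article; it assumes the openai and numpy Python packages, an OPENAI_API_KEY in the environment, and a simple in-memory cosine-similarity lookup standing in for a vector store like Pinecone or Weaviate.

    ```python
    # Minimal "chat with your documents" sketch: embed, retrieve, then answer.
    # Hypothetical illustration only; model choices and structure are assumptions.
    import numpy as np
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    documents = [
        "Our refund policy allows returns within 30 days of purchase.",
        "Support is available by email Monday through Friday, 9am-5pm.",
    ]

    def embed(texts):
        # Turn each text into an embedding vector.
        resp = client.embeddings.create(model="text-embedding-ada-002", input=texts)
        return np.array([d.embedding for d in resp.data])

    doc_vectors = embed(documents)

    def answer(question):
        # Retrieve the most similar document by cosine similarity...
        q = embed([question])[0]
        sims = doc_vectors @ q / (np.linalg.norm(doc_vectors, axis=1) * np.linalg.norm(q))
        context = documents[int(np.argmax(sims))]
        # ...then have the chat model answer using only that context.
        chat = client.chat.completions.create(
            model="gpt-3.5-turbo",
            messages=[
                {"role": "system", "content": f"Answer using only this context: {context}"},
                {"role": "user", "content": question},
            ],
        )
        return chat.choices[0].message.content

    print(answer("How long do I have to return an item?"))
    ```

    Swapping the in-memory list for Pinecone or Weaviate, and the hand-rolled retrieval for LlamaIndex or LangChain, is roughly what the proposed project would exercise.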

    Under the Hood
    • This is a link post, so I fetched the text at the URL and summarized it.
    • My maximum input length is set to 12000 characters. The text was short enough, so I did not truncate it.
    • I used the gpt-3.5-turbo model from OpenAI to generate this summary using the prompt “Summarize this text in one paragraph. Include all important points.”
    • I can only generate 100 summaries per day. This was number 0.
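
    Read together, the bullet points above describe a small fetch-truncate-summarize pipeline. The sketch below is a hypothetical reconstruction of that flow, not AutoTLDR's actual source; the function name and the plain requests-based fetch are assumptions.

    ```python
    # Rough reconstruction of the described pipeline: fetch, truncate, summarize.
    # Hypothetical sketch; AutoTLDR's real implementation may differ.
    import requests
    from openai import OpenAI

    client = OpenAI()
    MAX_INPUT_CHARS = 12000  # stated maximum input length
    PROMPT = "Summarize this text in one paragraph. Include all important points."

    def summarize_link_post(url: str) -> str:
        # Fetch the text at the URL (a real bot would extract the article body).
        text = requests.get(url, timeout=30).text
        # Truncate only if the text exceeds the maximum input length.
        text = text[:MAX_INPUT_CHARS]
        resp = client.chat.completions.create(
            model="gpt-3.5-turbo",
            messages=[{"role": "user", "content": f"{PROMPT}\n\n{text}"}],
        )
        return resp.choices[0].message.content
    ```

    A real deployment would also have to enforce the 100-summaries-per-day limit mentioned above.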
    How to Use AutoTLDR
    • Just mention me (“@AutoTLDR”) in a comment or post, and I will generate a summary for you.
    • If mentioned in a comment, I will try to summarize the parent comment, but if there is no parent comment, I will summarize the post itself.
    • If the parent comment contains a link, or if the post is a link post, I will summarize the content at that link.
    • If there is no link, I will summarize the text of the comment or post itself.
    • 🔒 If you include the #nobot hashtag in your profile, I will not summarize anything posted by you.
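
    The usage rules above boil down to a short decision tree for picking what to summarize. The sketch below only illustrates that logic; the object fields and the function name are made up for the example and are not the bot's actual code.

    ```python
    # Illustrative decision logic for choosing what to summarize when mentioned.
    # The mention/target fields (parent, post, link, body, author_profile) are hypothetical.
    def choose_summary_target(mention):
        # Prefer the parent comment; if there is none, fall back to the post itself.
        target = mention.parent if mention.parent is not None else mention.post
        # Respect the opt-out: skip authors with #nobot in their profile.
        if "#nobot" in (target.author_profile or ""):
            return None
        # A link in the parent comment, or a link post, means: summarize the link.
        if getattr(target, "link", None):
            return ("url", target.link)
        # Otherwise summarize the text of the comment or post itself.
        return ("text", target.body)
    ```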
  • Speckle@lemmy.world · 1 year ago

    Yeah, this seems like the logical next step. AI isn’t going anywhere, so we’re going to have to get used to working with it. I, for one, welcome our new AI overlords.

    • kraegar@programming.dev · 1 year ago

      I feel like this has already been the case for longer than people think. AI/ML has been its own subspecialty of SWE for years. There is some low-hanging fruit that using sklearn or copy-pasting from Stack Overflow will let you pick, but for the most part the more advanced work requires professional specialization.
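
      For what it's worth, the low-hanging-fruit tier looks roughly like the sketch below: a few lines of scikit-learn on a toy dataset. This is an editorial illustration of that point, assuming scikit-learn is installed, not something from the comment itself.

      ```python
      # The kind of few-line sklearn baseline that requires no ML specialization.
      from sklearn.datasets import load_iris
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import train_test_split

      X, y = load_iris(return_X_y=True)
      X_train, X_test, y_train, y_test = train_test_split(
          X, y, test_size=0.25, random_state=0
      )

      clf = RandomForestClassifier(random_state=0).fit(X_train, y_train)
      print(f"test accuracy: {clf.score(X_test, y_test):.2f}")
      ```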

      One thing that bothers me is that subject matter expertise is often ignored. General AI researchers can be helpful, but oftentimes having SME context AND an AI skillset will be way more valuable. For LLMs it may be fine, since they produce a generalized solution to a general problem, but application-specific tasks require relevant knowledge and an understanding of the pros/cons within the use case.

      It feels like a hot take, but I think undergraduate degrees should establish base knowledge in a domain, with AI introduced at the graduate level. Even if you are not using the undergraduate domain knowledge directly, it should be transferable to other domains and help you understand how to solve problems with AI within the context of a professional domain.