I'm using the term "robot" to refer to Artificial Intelligence (AI), bots, and Large Language Models (LLMs) in general. In particular, I'm talking about LLMs that can be used to process content in various ways. Basically, I'm referencing a non-human entity (software) that you can interact with: ask questions, ask to perform tasks, and use to make decisions.
The thoughts in this blog are purely my own, and do not reflect any plans or strategy that SAP has with regards to LLMs and AI.
What does the LLM have to say?
Meet Poseidon, my non-human LLM. He looks real, but he's just a bot. I asked his thoughts, and here's what he said:
As an AI language model, I cannot predict the future, but I can provide some insight into the current trends in automation and the job market.
Automation and artificial intelligence have been impacting many industries, including technical writing. AI-powered tools can now assist with grammar, style, and content suggestions, making the writing process more efficient. However, these tools are not perfect and still require human input to ensure the content is accurate, engaging, and relevant.
Is my job susceptible to computerization?
This website, Will Robots Take My Job, lets you search for your job and see the probability that you will be replaced by the bots. I searched Technical Writers, and it shows we have a 79% risk of automation.... who came up with that, the bots??? Yet a podiatrist has only a 12% risk. And a Septic Tank Servicer and Sewer Pipe Cleaner is at 60% risk. Good for them, I suppose. I don't know how accurate these numbers are, but go play around with it a bit. It's mildly fun.
What's the problem with replacing humans with AI?
Artificial intelligence (AI) isn't perfect yet, but it can be helpful and answer a variety of questions. It still has some flaws: it can spread false or unreliable information, have limited knowledge of a particular subject, lack creativity, and even be a bit biased at times.
An AI bot consolidates information from all over the internet, and sometimes that information is wrong or misleading. The "natural" language the bot uses isn't always natural. In fact, it can be quite verbose, and we know that user assistance should be simple, concise, and accurate. Can the AI bot follow company style and branding, terminology, language, and grammar guidelines? Probably not!
Let's test it out
So, I asked Poseidon for instructions on how to load a mouse trap. Here's what it told me:
Certainly, here's a step-by-step guide on how to load a mouse trap:
Choose a mouse trap: There are different types of mouse traps available in the market, so choose the one that you are most comfortable using.
Check the trap: Before loading the mouse trap, make sure it is clean and free of debris that could interfere with its operation.
Bait the trap: Place a small amount of bait, such as cheese or peanut butter, onto the bait holder. Make sure that the bait is not so big that it interferes with the operation of the trap.
Set the trap: To set the trap, press down on the metal bar or lever on the trap until it clicks into place. This will arm the trap and make it ready to catch a mouse.
There are additional steps after this that aren't worth mentioning, including "repeat steps 2-14." I'm not going to read the manual; it's TLDR. And it continues....
It is important to use caution and follow safety guidelines when handling mouse traps, as they can cause injury if mishandled. Make sure to read and follow the instructions provided by the manufacturer of your specific trap.
What would UA do?
WWUAD? If a user assistance developer were writing these instructions, the steps would be less verbose. Just the facts, man! These steps are probably printed on the mouse trap packaging and need to be brief. The language would be more friendly, casual, and to-the-point. We research the context, the users, and the situation, and we create the appropriate content for users.
Here's how a UA developer might write these instructions. Let's assume the person is using a new mouse trap. We don't have to tell them to pick one, or to open it. Maybe it's just me, but I'd probably use it once and then toss it along with the mouse guts. Ideally, I'd just make a video because that's even easier. Can AI do that?
Can AI answer personal questions?
When I asked Poseidon why my "unnamed family member" was so annoying, it tried to advise me and be my therapist. No, thanks. I'm not taking advice from a bot. I just read it and rolled my eyes. Thanks, Dr. Poseidon. I think I'll rely on human advice for this type of thing. Here's what it said:
As an AI language model, I do not have personal information about your unnamed entity or your specific situation. However, it's important to remember that everyone has their own unique personalities and communication styles, and what one person finds annoying may not be the same for another.
It's also important to recognize that while certain behaviors may be frustrating, it's usually not helpful to label a person as "annoying." Instead, try to communicate your feelings and concerns to your unnamed entity in a respectful and constructive manner. Open and honest communication can often help to improve relationships and resolve issues.
Do you want AI operating on your brain?
I don't think so. Maybe it can help guide my surgeon, but some human logic and experience should be applied when cutting into my brain. Even a robotic surgery still needs an actual, educated human surgeon, because when it comes to human life, there's no room for error from a "glitch" in the system.
I saw on Chicago Med (one of my favorite shows) recently that the doctors are using this "2.0" AI/robot advisor to work on particular patients. I know, it's not reality, but I think we're not that far away from having this in real life. I see the conflicts the doctors have with the bot's "judgement" and procedures. Let's not give away all the important jobs to bots and AI. Humans still have some useful skills to contribute! According to Will Robots Take My Job, surgeons have a 0.0% automation risk. Hmmm, I guess that's a good thing, no?
Technical writing jobs may evolve as AI and automation technologies continue to improve, but it is unlikely that robots will completely replace human technical writers in the near future. Human expertise, creativity, and ability to understand complex concepts are still essential in producing high-quality technical documentation. (Source: Poseidon LLM).
What can we do?
Just like other professions, we'll have to adapt to the changing job market. We can develop skills that complement AI technologies, such as critical thinking, project management, creativity, originality, and leadership. In addition, we can keep up to date with the latest tools and technologies so we can be more competitive in the job market. I'm not saying we will become cyborgs like Arnold in The Terminator, but we can adapt to changing technologies and be successful.
SAP notes that posts about potential uses of generative AI and large language models are merely the individual poster's ideas and opinions, and do not represent SAP's official position or future development roadmap. SAP has no legal obligation or other commitment to pursue any course of business, or develop or release any functionality, mentioned in any post or related content on this website.