u/buzzlightyear0473

Does anyone see tech writing making a comeback or not actually being automated away with how AI is going?

Tech writing has been in a bad place lately due to AI and the overall economic outlook/job market, as we all know. However, I'm seeing a ton of sentiment lately that AI is starting to crash:

  • It's not actually taking people's jobs
  • AI companies are massively jacking up token prices to cover compute costs and chase profitability
  • More than half of planned data-center projects are getting cancelled or delayed
  • Model performance is stalling, and banks are pulling back
  • Companies are already blowing through AI budgets, not even halfway through the year
  • NVIDIA execs are saying that AI is more expensive than workers
  • Investors aren't buying the AI-washing layoffs anymore
  • AGI is still a sci-fi concept, and LLMs are built in a way that makes AGI intrinsically out of reach. No amount of throwing hardware at them and scaling can change that.

Of course, companies are still going for a last-ditch effort, with mass layoffs continuing and AI cited as the reason. We all know what happened to AWS and Snowflake, but we're finally seeing some investors scrutinize this. Microsoft and Amazon stock tanked after they announced massive spending deals, and Cloudflare stock dipped almost 20% the other day after it used AI as an excuse for layoffs.

When this AI bubble pops, we'll keep having AI, of course, but it seems like we can't sustain this free lunch era for much longer. Companies will very likely pull back on AI costs when model performance begins to match pricing.

I know companies aren't seeing it now, but LLM performance fundamentally relies on documentation, human-written prompts, context, skill-file instructions, and someone who architects all of this. Literally, who does this better than a tech writer?

AI is an insanely powerful tool, but the promises these AI tech bros advertise, and what execs are buying into to appease shareholders, are a pipe dream. I know it's rough right now, but I'm convinced that this has to be a transitory period.

If anything, this is just making a stronger case for tech writers to become Information Architects in a more strategic sense.

Do you think tech writing will come back? Right now, things feel absolutely F'd with the job market and what company execs are falling for, but I don't think this will last. Even in software, I feel like the shift will just move to Content Ops and Documentation Engineering, while we see traditional tech writing stay for things like Aerospace, Medical Writing, Hardware, DoD, and highly regulated docs where the human-in-the-loop is critical.

reddit.com
u/buzzlightyear0473 — 4 days ago

I am in the final stages of an AI Governance role where I’d be taking ownership of creating the program, while collaborating with existing GRC folks and cross-functional stakeholders. I’ve been a technical writer in cybersecurity and have a lot of hybrid experience in GRC, but never a formal role with a mandate to do GRC tasks or facilitate programs.

How stressful is GRC? I understand that it’s much more social, you need to be assertive, etc., but how stressful do you think this role could be? I’d love to hear what some of the stressors are in this role and if I can prepare or evaluate my career trajectory. As someone who deals with imposter syndrome and feels a bit lost (but equally excited), I’d love to know more!

reddit.com
u/buzzlightyear0473 — 8 days ago

I am aspiring to pivot to the AI Governance side of cybersecurity GRC. How stressful is the job since so much of AI Governance is maturing/evolving fast along with the tech?

reddit.com
u/buzzlightyear0473 — 13 days ago

I am a technical writer likely to be laid off in the near future: we were acquired, re-orged, had projects sunsetted, and got a new senior manager who wants to cut team size. I've worked at the world's largest cybersecurity companies for 5 years, and the automation push is getting worse and worse; we went from "adapt to AI" to cutting team members by up to 50% because AI is apparently good enough for security documentation. I see no long-term future in tech writing, and my local job market has no cyber companies or salaries nearly as high as I make now. If I lost my current gig, I'd have to go back to 6-month contract jobs paying maybe $60k with no benefits. With a baby due in September and a new mortgage, I'm just trying to secure a future.

I’m starting the interview process for an AI Governance role, as my tech writing jobs had tons of crossover with GRC. I love documentation. I’m a "check-boxer"-brained person, and I like work-life balance: just making good money in a stable career with growth opportunities. Cybersecurity is interesting, but GRC sounds perfect for me. I’m also extremely passionate about responsible and secure AI use, since the AI threat to my current career has made me obsessed with keeping up with the trends and staying a realist about it.

I hear people say GRC is not stable either and is ripe for "efficiency" cuts. I think just about anything is better than tech writing, but will I stand a chance as a fresher in the field with how things are going?

reddit.com
u/buzzlightyear0473 — 17 days ago