
🦺 Star Trek and its Technological Predictions

Inventors and Inventions


Welcome to the 35th edition of Safe For Work. Some recent safety news and then a story inspired by the technologies envisioned in Star Trek.

In Safety News

  • If you work in safety or operations, you likely appreciate and understand many physics concepts, from the Laws of Motion to electromagnetic fields. This is a fascinating explanation of how the Higgs field gives mass to objects.

  • Applying AI to sound. New technologies are being commercialized with implications for workplace safety. One example in the story is detecting potential engine failures before they happen from a change in an engine's sound.
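For the curious, the core idea behind sound-based failure detection can be sketched in a few lines: record a "healthy" acoustic baseline, then flag readings whose frequency content drifts away from it. This is a minimal illustrative sketch with synthetic signals (the 120 Hz hum, the 480 Hz rattle, and the function names are all assumptions for the example, not any vendor's actual system):

```python
import numpy as np

def spectral_signature(samples):
    """Normalized magnitude spectrum of an audio clip."""
    spectrum = np.abs(np.fft.rfft(samples))
    return spectrum / (np.linalg.norm(spectrum) + 1e-12)

def anomaly_score(baseline, current):
    """Distance between the healthy baseline signature and the current one."""
    return float(np.linalg.norm(baseline - current))

# Simulate one second of audio at 8 kHz: a healthy 120 Hz engine hum,
# and the same hum with an added 480 Hz rattle (say, a worn bearing).
t = np.linspace(0, 1, 8000, endpoint=False)
healthy = np.sin(2 * np.pi * 120 * t)
worn = healthy + 0.5 * np.sin(2 * np.pi * 480 * t)

base = spectral_signature(healthy)
score_ok = anomaly_score(base, spectral_signature(healthy))
score_bad = anomaly_score(base, spectral_signature(worn))
# score_bad is much larger than score_ok, so a simple threshold
# on the score can raise a maintenance alert before the failure.
```

Real systems use learned models rather than a fixed baseline, but the principle is the same: the failure announces itself in the spectrum before it announces itself mechanically.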

Safety Precautions for Holo-Emotional Software (HE-7000) Implementation

Disclaimer: Emotional safety and psychological well-being are vital to maintaining a healthy and efficient workplace. Users should take care when engaging with Holo-Emotional Software to avoid emotional burnout, psychological dependency, and detachment from reality. See the HE-7000 User Manual for further guidelines.

---

“Begin program.”

“Program initializing… initializing… ready for input.”

The room gave its obligatory flicker—no one ever paid attention to the flicker anymore, which was strange. Flickers are the first sign of hardware failure, but Hugh had bigger things to worry about. Besides, this was a top-tier simulation room, thoroughly inspected and certified. They wouldn't let something unsafe near the public. Would they?

“Welcome to your session, Hugh Jordan.”

Of course it knew his name. Everything knew your name these days. $75 for an hour of self-delusion. $120 for two hours. Hugh figured he only needed one. He wasn't planning on losing himself—just a little escape. A little closure.

“Thank you,” he said to the voice that wasn’t really there. Oddly, it felt rude not to.

“Of course. Would you like to give a brief description of the experience you are hoping to have?”

This was where it got awkward. He didn’t have the words. Engineers always have trouble with words—they prefer numbers, blueprints, simulations. They don’t design emotions. Not yet.

“I need to talk to someone,” he said.

“Very good! Is the person you wish to speak to living or dead?”

“She’s alive.”

“Excellent! Are you trying to set up a holo-call?”

“No,” Hugh muttered. “I want… I was hoping…”

“Does she have a registered model and Digital Personality Facsimile with us? Or would you like to create one now?”

He hesitated. Had Nabina signed up for this? He doubted it. She was never a fan of techno-solutions to human problems.

“I’d like to create a new model and DPF.”

“Understood. Who is the individual you would like us to replicate today?”

“Nabina Guyarda.”

“Searching… match found!”

Hugh wondered briefly how long it had taken some underpaid programmer to write the algorithm that combed through public records, social media, and God knows what else, to resurrect the version of someone you wanted to see. Engineers knew that recreating a person’s face, voice, and mannerisms was easy. Recreating their spirit—well, that’s where things always went a little off the rails. But then, most people never asked for reality. They asked for comfort. Comfort, unfortunately, was highly customizable.

The holo-emitters buzzed as Nabina's form began to materialize: part memory, part code, all illusion. Hugh sighed at the sight. She was wearing the green shirt from her profile picture and the jeans from five years ago. As if past versions of her had collided, forming a ghost.

“Beginning construction of Digital Personality Facsimile. Do you have supplementary data to contribute?”

“Yeah.” Hugh had a folder prepared—pictures, texts, voice clips. Digital memories. It was all too easy to confuse memories with love, with truth. He uploaded the data. The machine whirred.

“Thank you. Construction will take several minutes; feel free to interact with your model while you wait.”

“Interact?” Hugh blinked. What did that even mean?

“Yes,” Nabina’s echo said. Her voice was a patchwork quilt of phone calls and videos, stitched together by some underpaid software engineer who probably hated their job. “You can give me simple instructions. I will follow them, Hugh.”

He shuddered. A doll with his wife’s face, her voice, her name. But not her. No matter what they claimed, no matter what the marketing said.

“Say, ‘I was wrong,’” Hugh said.

“I was wrong,” she replied, with an eerie smile.

He laughed—short, harsh, ugly. So lifelike, so completely wrong. 

---

As Nabina continued to “interact” according to his inputs, Hugh reflected on the unseen processes at work behind the scenes. Engineers had designed these programs to respond to requests, no matter how human or manipulative. The holo-simulation didn’t care about ethics—it followed commands, executed logic, worked from preloaded data. Engineers always loved things that could be measured, calculated, controlled. Emotions were just data points now, something to analyze and program.

And yet, as her digital voice repeated lines he fed her—lines designed to elicit some long-sought apology—it hit him like a malfunctioning phase coil. What kind of safety protocols had been written into this system? Who oversaw the emotional fallout of “users” who walked into this holo-room, hoping to repair a broken past?

The system chimed again: “It appears you are interacting with a more agreeable version of Nabina Guyarda. Would you like to modify the Personality Facsimile to better suit your preferences?”

He hesitated. Engineers never hesitated. Hesitation leads to mistakes. But this? This was different.

“Yes,” Hugh said. “Make her... agreeable.”

---

In three seconds, the system recalibrated. It had all the necessary tools—facial recognition, voice synthesis, mood mapping—to produce a version of Nabina who smiled warmly, who said the right things, who was always wrong when he needed her to be.

And there she was, standing before him—his creation, not the woman who had walked away. Hugh felt a twist in his gut. They warned about “reality distortion risks” in the safety briefing, but he hadn’t read the fine print. Engineers rarely do. They trust their systems. Until they don’t.

---

Note: Ensure all users of Holo-Emotional Software HE-7000 are briefed on Emotional Integrity Compliance. Misuse of emotional simulations may result in psychological dependency, emotional dissonance, and detachment from interpersonal relationships. Refer to HR Protocol 217.9 for guidelines on Holo-Therapy use in the workplace.

See you next week as we draw inspiration from inventors and inventions, exploring the evolution of computers.

Stay safe.
