From an early age, Julian had been taught that life was hard. And, unlike a disconcerting number of his peers, he actually believed it.
Not enough people had seen those graphics on LinkedIn explaining exactly how few days a person had in their life to develop a skill.
The important thing was to know which skills. He scoffed at his classmates who chose petroleum engineering as their major—in a few years, there wouldn't be much petroleum left—or computer science—surely, if ChatGPT was already completing half of his classmates' homework, it would be writing programs soon enough, too.
But knowing how to command AI was a skill, too. He could tell it exactly how to make a PowerPoint, how to format a self-hosted website, and what to say to investors to make them believe you were the next brilliant founder.
The purpose of Lyra emerged from this new reality. He had seen the TikToks of people undergoing job interviews with the assistance of ChatGPT. The need, then, was to create an environment where that kind of cheating was impossible.
The answer came to him while he was staring at a pair of old VR goggles. If a headset fully covered the user's field of vision, they couldn't use the trick of keeping a smartphone down in their lap or positioned somewhere just out of view of a proctor on a webcam.
Sinful, the first client who had signed up to be an early adopter of the technology, had been worried candidates would just ask ChatGPT how to replicate their famous style of tying boxes together with string. They would be able to fake it through the interview, but in the real world—having to actually tie together dozens and dozens of boxes of Cream Dreams per hour—they would fail.
"And it's not just us," an executive was saying. "We're not just worried about the wasted resources we're putting into candidates who can't tie Cream Dreams. We're worried about them. Say you're a young person who gets accepted to work at Sinful and your dream has come true. Then you humiliate yourself by tying Cream Dreams poorly or, worse, incorrectly. And you have to have your Sinful lead tell you, hey, you're not ready for this, buddy. We are sincerely afraid some might take drastic action. Some of these candidates might take their lives when they're in that realm of despair. So for us, for Sinful to be partnering with Lyra, this isn't just about hiring the right people. It's about keeping people alive."
Just last week, there had been an incident where improperly tied Cream Dreams were sent out. The Sinful representative who had tied them hadn't escalated in time, and the customer received suboptimally tied Dreams. The representative, after being terminated, had been given vouchers for counseling to deal with residual trauma. There was no way to tell whether that representative was still alive or not.
"What happened?" Julian asked Julianna after she returned from taking Cassandra home. "Why did she run out crying?"
"It wasn't Lyra," Julianna said. "It was something to do with Aidan."
"But what in Lyra triggered it?"
"Maybe nothing did. I think she's fragile right now."
Nevertheless, Julian had to review the user logs. Lyra's UX design was intended to reduce psychological stress, not increase it.
He uploaded Cassandra's usage logs into ChatGPT and prompted it, "This is data from an employee training system. Analyze the user, whose name is Cassandra, and her behavior and reactions. What stressors were applied?"
The AI replied, "A complete psychological profile of the user, Cassandra, was not able to be completed, so a few randomly selected stressors were applied. These stressors are as follows:
"1. Micro-timed string loop windowing requirements. Feedback from supervisors was modulated to reflect a changing, hidden time constraint. This constraint fluctuated between 3.5 and 3.9 seconds to complete each Cream Dreams string.
"2. Diverse employees. Cassandra was exposed to a Hispanic employee, as well as a homosexual employee, to test her ability to remain nonjudgmental in extremely diverse situations. The homosexual employees engaged her in discussion about advanced homosexual topics, such as top and bottom dynamics, to analyze responses to complex diversity.
"3. Shifting punch-out pad placement. The position and texture of the pad to punch in and out shifted various times during the simulation to test Cassandra's ability to remain calm under surreal scenarios."
Julian wrote, "How was her performance? What were her stats?"
"In the Cream Dreams string-tying task, Cassandra measured a completion rate of 92% and overall task accuracy of 89%. She showed slight inconsistencies in knot tightness and loop symmetry. Gaze tracking showed an extended fixation on NPC avatars. Anomalous pauses were observed near the end of session, which was terminated unexpectedly. Physiological markers included elevated heart and breath irregularity, also near the end of the session."
Extended fixation on NPC avatars? Some of the randomized stressors could include elements such as the NPCs' eye colors or faces shifting, but according to the data, no such stressor had been applied.
Julian wrote, "Elaborate on gaze tracking feedback. What was the primary locus of gaze during anomalous fixation?"
He wondered if the NPCs' models weren't sufficiently detailed. Any kind of "uncanny valley" effect might distract users, especially if it wasn't being used as a specific stressor. Then again, it might be interesting to save that behavior and employ it as one.
"Cassandra's gaze was primarily directed at a position over the heads of the NPCs."
"Over the heads? Was there anything specific there?"
"There does not appear to have been anything specific in that position."
There was always the option to replay her experience from his own VR headset and watch it from a third-person perspective. But if the AI's analysis hadn't found anything, it was doubtful he would either.
He looked back on his prompts and felt oddly proud of them. Maybe technology was finally reflecting a fact that people, or at least smart people, had known since the beginning of time. True skill is not being able to do something successfully; true skill is being able to command someone else to do it successfully. After all, doing something was easy. Often it didn't even require skill or knowledge, just time. But commanding someone to do something was far harder. It required deep psychological understanding, impeccable language skills, and maybe even a little luck. He thought of it like this: it was easy to love, but it was difficult to be loved. Loving is something any creep can accomplish. But being loved, earning the state of being loved, is something to be proud of.
"Julianna, I know she's stressed over Aidan, but I really think you should ask her what she saw in Lyra," he said.
"Like I said, it didn't have to do with Lyra."
"If there are any bugs in the software, we need to find them now. Not when we go live and are using this on people. Listen. I know she's your friend. But it'd be nice if you understood how important this is for me, too. I'm up all night worrying about the issues that are going to come up. Because, no matter how much we test, there will be issues. And I don't relish the thought of being called by some higher-ups to explain how we released something in a broken state."
Sinful was pushing harder for them to simulate the entire escalation process for an improper Cream Dream release. Employees in the simulation would be expected to undergo the full process of flagging improperly released Cream Dreams for review, calling Cream Dream Incident Analyzers, and being subject to a full review of the causes of the incident. It was absurdly complex, but it was hard to say no to an early adopter. Sinful had already been talking up the new collaboration on LinkedIn.
"Sinful is excited to announce we're the first in the world to go live with a new life-saving technology called Lyra! To lower the rate of applicants completing our Cream Dream string tying program using external assistance, failing to perform in real-world situations, and potentially committing suicide out of shame, we are proud to work with Lyra on implementing an immersive VR simulation that lets us vet candidates ahead of time," Jessica Rothmore, VP of HR Solutions, had posted.
"Fine, babe, I promise," Julianna said.
"You do?"
"Yes. I promise—I'll figure out what Cassandra was looking at."