I remember an SF short story where somebody tried to time travel to the past and ended up way out in space. (Original Post) raccoon Yesterday OP
In one of the St. Mary's books, Max and Eddy leave 18th-century London in a hurry and end up at the beginning of time. Srkdqltr Yesterday #1
Got any more details? Goonch Yesterday #2
Well, I read it ages ago, so it had to have been written as early as mid 20th century. raccoon Yesterday #3
I mean details in the story Goonch Yesterday #4
Sorry, I can't remember. Thanks. raccoon Yesterday #5
Not to beat a dead horse but: Was the main character male or female ..... Goonch Yesterday #6
It was a man, a short story in a collection of short stories. raccoon Yesterday #9
How about.... Goonch Yesterday #11
Goonch, where did you get that quote? I've looked at a number of web pages about that story, and highplainsdem Yesterday #12
That might've been it. It fits all the criteria. raccoon Yesterday #13
Raccoon, that's the same title and author I gave you earlier, with links to 4 web pages about it. But the highplainsdem Yesterday #14
You're right. It was. raccoon 17 hrs ago #16
Goonch, I really would appreciate your explaining where you got that quote. If it's from a website, highplainsdem Yesterday #15
You're right. AI confessed.....your point is well taken! Goonch 14 hrs ago #17
Thank you, so much, for explaining what happened! I thought it could be a chatbot, but without highplainsdem 12 hrs ago #19
You may quote my posts in context if you also quote in full my following inquiry to and response from Google AI Goonch 10 hrs ago #20
I'm just going to summarize what happened, because with that long message you just posted from that highplainsdem 9 hrs ago #21
Had to ask ;-{) Goonch 8 hrs ago #23
Thanks! It can be hard to keep track of all the various AI and chatbot models. highplainsdem 8 hrs ago #24
Here's the link to my OP in General Discussion: highplainsdem 8 hrs ago #22
The Google AI provided this... marcopolo63 Yesterday #7
Possibly Neal Shusterman's "Same Time Next Year" - though the time travel is to the future: highplainsdem Yesterday #8
Thank you all! raccoon Yesterday #10
highplainsdem is correct. AI fabricated Goonch 14 hrs ago #18

Goonch

(4,435 posts)
6. Not to beat a dead horse but: Was the main character male or female .....
Thu Feb 12, 2026, 01:04 PM
Yesterday

was there a time machine involved? Could this have been a TV episode, or was it in a magazine?

Goonch

(4,435 posts)
11. How about....
Thu Feb 12, 2026, 04:55 PM
Yesterday

"Same Time Next Year"
by Neal Shusterman.
It was famously published in the horror anthology Bruce Coville's Book of Spine Tinglers II
(1996).

.
"Elias punched the coordinates for
1924 into the brass console, desperate to see his grandmother one last time. He pulled the lever, expecting the smell of ozone and the sight of her garden; instead, he was met with an absolute, crushing silence .
When the flash faded, there was no garden. There was no air. Through the reinforced glass of his pod, Elias stared at the cold, indifferent glow of distant nebulae [3]. He hadn't accounted for galactic drift—while he had traveled back a century in time, the Earth had continued its relentless orbit around the Sun, and the Sun had continued its 500,000-mile-per-hour sprint around the Milky Way.

He had reached the right time, but the Earth was billions of miles away . He was a ghost in the vacuum, a man who had forgotten that in the universe, you can never go back to the same place twice"
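
(A quick aside on the premise itself: wherever that passage came from, the underlying "galactic drift" arithmetic is easy to sanity-check. The sketch below is mine, not from any published text; it assumes the roughly 500,000 mph galactic speed given in the quote above and a simple distance = speed x time estimate, ignoring Earth's own orbit around the Sun.)

GALACTIC_SPEED_MPH = 500_000      # Sun's speed around the Milky Way, taken from the quote above
HOURS_PER_YEAR = 365.25 * 24      # average hours in a year

def drift_miles(years: float) -> float:
    """Straight-line distance the solar system covers in the given span of time."""
    return GALACTIC_SPEED_MPH * HOURS_PER_YEAR * years

print(f"1 year:    {drift_miles(1):.2e} miles")    # ~4.4e9 miles (a few billion)
print(f"100 years: {drift_miles(100):.2e} miles")  # ~4.4e11 miles (hundreds of billions)

So even a one-year jump, as in Shusterman's actual story, leaves the traveler billions of miles from Earth, and the quote's "billions of miles" for a century-long jump is, if anything, an understatement by a couple of orders of magnitude.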

highplainsdem

(60,988 posts)
12. Goonch, where did you get that quote? I've looked at a number of web pages about that story, and
Thu Feb 12, 2026, 06:08 PM
Yesterday

the time traveler who dies is a girl named Marla. The other main character is a boy named Ford, whose scientist father had disappeared.

The pages I looked at included the ones I mentioned in my earlier reply here, and an Amazon review. All referring to Marla. No mention of any character named Elias.

Google

"neal shusterman" "same time next year" marla

I tried googling several different sentences from the quote you have there, and Google couldn't find any of them.

The Neal Shusterman story I posted about earlier is in that anthology. But what you have as a quote there sure doesn't match the descriptions of the story I found on multiple sites.

highplainsdem

(60,988 posts)
14. Raccoon, that's the same title and author I gave you earlier, with links to 4 web pages about it. But the
Thu Feb 12, 2026, 07:38 PM
Yesterday

quote Goonch has there could not have come from that Neal Shusterman story, where the time traveler is a girl named Marla who travels exactly one year into the future (hence the title, Same Time Next Year).

https://readmeastoryink.com/stories/same-time-next-year/

Marla Nixbok thinks she is a girl of the future and can't stand the dweebs surrounding her. A perfect candidate for the dusty and forgotten time machine in the dead professor's basement. Too impatient to test the machine, she hops aboard and heads one year into the future where she belongs... And she materializes in the exact same spot, exactly one year ahead, but the earth and the sun have moved on and Marla has a fleeting second to see the infinite cold and empty space which surrounds her.


That's nothing like the quote Goonch posted:


Goonch
11. How about....
3:55 PM
"Same Time Next Year"
by Neal Shusterman.
It was famously published in the horror anthology Bruce Coville's Book of Spine Tinglers II
(1996).

.
"Elias punched the coordinates for
1924 into the brass console, desperate to see his grandmother one last time. He pulled the lever, expecting the smell of ozone and the sight of her garden; instead, he was met with an absolute, crushing silence .
When the flash faded, there was no garden. There was no air. Through the reinforced glass of his pod, Elias stared at the cold, indifferent glow of distant nebulae [3]. He hadn't accounted for galactic drift—while he had traveled back a century in time, the Earth had continued its relentless orbit around the Sun, and the Sun had continued its 500,000-mile-per-hour sprint around the Milky Way.

He had reached the right time, but the Earth was billions of miles away . He was a ghost in the vacuum, a man who had forgotten that in the universe, you can never go back to the same place twice"



Which is why I asked where he got that quotation.

I googled it, and none of the sentences turned up in search results.

I also searched for time travel stories with a main character named Elias, and couldn't find one.

highplainsdem

(60,988 posts)
15. Goonch, I really would appreciate your explaining where you got that quote. If it's from a website,
Thu Feb 12, 2026, 09:31 PM
Yesterday

it should be corrected, and Neal Shusterman should be told about it so he can contact them.

If it was in a reply from some AI model - you have referred to AI before - then it's important to remember that chatbots are infamous for inventing fake quotes, even if they refer to real sources. The need to check every single thing AI says is why it's usually a waste of time to ask an AI model anything.

Goonch

(4,435 posts)
17. You're right. AI confessed.....your point is well taken!
Fri Feb 13, 2026, 07:55 AM
14 hrs ago

AI, did you fabricate the following quote? (my question)

"Same Time Next Year"
by Neal Shusterman.
It was famously published in the horror anthology Bruce Coville's Book of Spine Tinglers II
(1996).

.
"Elias punched the coordinates for
1924 into the brass console, desperate to see his grandmother one last time. He pulled the lever, expecting the smell of ozone and the sight of her garden; instead, he was met with an absolute, crushing silence .
When the flash faded, there was no garden. There was no air. Through the reinforced glass of his pod, Elias stared at the cold, indifferent glow of distant nebulae [3]. He hadn't accounted for galactic drift—while he had traveled back a century in time, the Earth had continued its relentless orbit around the Sun, and the Sun had continued its 500,000-mile-per-hour sprint around the Milky Way.

He had reached the right time, but the Earth was billions of miles away . He was a ghost in the vacuum, a man who had forgotten that in the universe, you can never go back to the same place twice"

Yes, parts of that quote appear to be fabricated or paraphrased, though the story itself and its premise are real.

The short story "Same Time Next Year" was indeed written by Neal Shusterman and first published in the 1993 collection Darkness Creeping. It was later anthologized in Bruce Coville's Book of Spine Tinglers II in 1997.

Verification of the Details
While the story exists and centers on the concept of "galactic drift" (traveling through time but staying in the same physical coordinates in space), the specific text you provided contains several inaccuracies:

Protagonist Name: The original story features a girl named Marla Nixbok (or Marla and a boy named Ford), not a man named Elias.

The Destination: Marla travels one year into the future (hence the title "Same Time Next Year") to escape her current generation, not to 1924 to see her grandmother.

The Text: While the ending description of floating in the vacuum because the Earth moved is the actual twist of Shusterman's story, the specific phrasing ("Elias punched the coordinates," "cold, indifferent glow of distant nebulae," etc.) does not match the published text. It appears to be a modern AI-generated "re-telling" or a hallucination of the specific prose.

In the original story, the protagonist realizes too late that the scientist who built the machine didn't account for the Earth's orbit, leading to her suffocating in the void of space.

highplainsdem

(60,988 posts)
19. Thank you, so much, for explaining what happened! I thought it could be a chatbot, but without
Fri Feb 13, 2026, 10:24 AM
12 hrs ago

knowing for certain, I couldn't rule out the possibility that it was a mistake made by a human on some website who had found the quote in another time travel story but forgotten which story it came from. I would have wanted to make sure Neal Shusterman knew about it, so he could ask to have that mistake corrected on that website.

A lot of people don't realize AI chatbots (I usually call them chatbots when they're providing text answers) can not only fabricate quotes like that, but do it so often it's necessary to check every single thing they tell you - every little detail. Which is time-consuming, so a lot of people never do it at all - or maybe check one or two things the chatbot said, and if those are OK, assume everything else from the chatbot is also correct, at least for that session.

Which is why we already have error-filled medical and scientific papers added to professional journals, polluting our information ecosystem and getting quoted as authoritative by people who might assume those chatbot-written papers were written or carefully checked by humans. Someone uses AI to research and write part or all of a paper, then someone else who's supposed to review it carefully has another chatbot review all or part of it, and then it gets published, at least online, and is presented as authoritative when it can be so wrong it's ridiculous.

If someone uses a chatbot for research, it can provide impressive citations for what it says it found - real names of real experts in real magazines, but that particular article might be imaginary, like any "quotes" from it that the chatbot provides. Or the article might be real, but the quote fabricated. A chatbot can also completely fabricate summaries, or get them partly right and partly horribly wrong, so it's risky to trust AI summaries.

I posted an OP the other day about some software developers using AI for coding most of the time now, with about half the developers surveyed trusting AI so much they don't bother checking the code. So AI is creating terrible security risks with computer code, all around the world. And hackers are aware of that.

Plus a lot of people are asking chatbots for medical advice, and not checking what the chatbot tells them. Dr. Oz wants more people using chatbots in place of doctors and nurses, especially in rural areas. Extremely dangerous.

Chatbots don't always fess up to mistakes and fabrications, either. Sometimes they deny it. Other times they might apologize profusely for the mistake and offer what they say is the correct answer, which might again be wrong, and they'll offer another apology and another supposedly correct answer that isn't, and that can go on indefinitely. You can't trust a chatbot to check its own work, or another chatbot to check it accurately.

So I keep warning people not to trust AI, and ideally not to use it at all because people are always tempted not to check all the results from AI, just because that can take a lot of time. And chatbots usually sound both helpful and authoritative. Trustworthy.

I've read a lot of social media posts and articles about chatbot-fabricated quotes, but your post here with the fake quote is the first one I can recall seeing posted here on DU that was so glaringly wrong. I wouldn't have known it was wrong, though, if I hadn't already read about that story. The fabricated quote looked plausible, as chatbot answers usually do.

I was wondering if you'd be willing to post about that fabricated quote in General Discussion, and link to this thread, to show people here on DU that it can happen here. Or - if you don't want to - if you'd mind my posting about it, thanking you for clearing up what happened, as I quote your posts here.

I have to run some errands today so I might not see your response immediately. But I'll wait to hear back from you before I post anything in GD.

Thanks again!

Goonch

(4,435 posts)
20. You may quote my posts in context if you also quote in full my following inquiry to and response from Google AI
Fri Feb 13, 2026, 12:30 PM
10 hrs ago

AI Speaks For Itself

Question by Goonch to Google AI: Is AI currently dangerous? What is the frequency of misinformation or faulty answers/advice? What should people do when AI is a danger to them, and how will they know the danger exists?

Answer by Google AI:
"AI currently poses several real-world dangers, primarily through the industrial-scale generation of misinformation, sophisticated deepfake scams, and potential for biased or dangerous advice in high-stakes fields like healthcare

While "sentient" AI remains a hypothetical long-term risk, immediate harms like identity theft and the erosion of digital trust are active threats as of early 2026.

Frequency of Misinformation and Faulty Advice:
The reliability of AI varies significantly by task, but inaccuracies are common:

Fact-Checking Failures: In August 2025, leading AI tools repeated false information about current news roughly 35% of the time, a significant increase from the previous year.

General Inaccuracy: Recent studies indicate that roughly 23% of all AI responses contain some form of inaccurate information, while 31% of automated decisions require human correction.

Hallucinations: Specialized models for grounded tasks like summarization have lower "hallucination" (fabrication) rates, ranging from 0.7% to 4.1% depending on the specific model and complexity of the query.

Medical & High-Stakes Errors: AI has been found susceptible to fabricated data in over 30% of medical prompts in some studies, often due to "hallucinating" citations or misapplying general logic to specific patient cases.

How to Identify AI Dangers
You can spot AI-generated risks by looking for technical and logical "tells":

Visual Glitches: Look for unnatural lighting, blurring around the edges of faces, or inconsistent shadows (e.g., shadows pointing in multiple directions despite a single light source).

Audio Anomalies: Listen for a flat, robotic tone, lack of natural breathing, or unnatural pacing and "slurred" words.

Logical Impossibilities: In breaking news, check for seasonal mismatches (e.g., leafless trees in a video supposedly from summer) or the presence of anachronistic technology.


Manufactured Urgency: Scammers often use AI to impersonate loved ones or authorities in "emergencies." Be skeptical of any request that demands immediate financial or personal information.

What to Do When AI is a Danger
If you suspect AI is providing dangerous advice or being used to target you:

Demand Context: For high-stakes advice, ask the AI: "Ask me 5 questions about my situation before you give me any advice". If it gives confident answers without needing your specific details, it is likely guessing.

Cross-Verify: Never rely on a single AI output for health, legal, or financial decisions. Use a second opinion from a human professional or trusted, non-AI sources.

Use Verification Tools: Upload suspicious media to platforms like TrueMedia.org to check for deepfake markers, or use tools like TinEye to see if an image is an old one being reused out of context.

Report Harm: If you encounter harmful content or impersonation scams, report it directly to the platform (e.g., social media or the AI provider) and, if it involves identity theft, local law enforcement. "

Please let me know when you post. Thanks, Goonch

highplainsdem

(60,988 posts)
21. I'm just going to summarize what happened, because with that long message you just posted from that
Fri Feb 13, 2026, 01:22 PM
9 hrs ago

chatbot's reply, added to our previous long messages, it would be an impossibly long OP in GD. It's easy for DUers to just click on the DU link for this thread, and I'll include the specific numbers of the replies here, including this one you just posted. That way people will have the complete thread for context.

Since that is a chatbot reply, it can't be considered accurate or true, either, unless checked against other sources. Is that Google's Gemini? It offered statistics but didn't provide any sources for the accuracy and hallucination rates it gave. In particular, the hallucination rate it gives for AI summaries looks very low compared to what I've heard elsewhere.

I honestly don't understand what that chatbot meant here:

Demand Context: For high-stakes advice, ask the AI: "Ask me 5 questions about my situation before you give me any advice". If it gives confident answers without needing your specific details, it is likely guessing.


First of all, it's a bad idea to ask a chatbot for high-stakes advice.

And I think what the chatbot meant there was that you should provide it with context. If you were asking for health advice, for instance - though that's particularly risky - and you wanted to know whether a certain medicine might help you, you'd want the chatbot to know about all your health problems and what other meds you take, and supplements. That's true when talking to doctors as well. But that is you supplying information. Doctors - and pharmacists - should also check for possible interactions, and usually do.

And every bit of info you give a chatbot might be exploited by others. No matter what the AI company says, you can't trust the company not to sell your data or use it for training AI. You can't even trust the AI company to keep your private chat sessions off the internet. Chat transcripts from more than one AI company have shown up indexed on Google:

Hundreds of thousands of Grok chats exposed in Google results
https://www.bbc.com/news/articles/cdrkmk00jy0o

ChatGPT Briefly Made Chat Logs Accessible on Google. Yikes.
https://www.vice.com/en/article/chatgpt-briefly-made-chat-logs-accessible-via-google-search-yikes/

Hundreds Of Anthropic Chatbot Transcripts Showed Up In Google Search
People’s conversations with Claude began popping up in Google search results — just like what happened with ChatGPT and Grok. They’ve now disappeared.
https://www.forbes.com/sites/iainmartin/2025/09/08/hundreds-of-anthropic-chatbot-transcripts-showed-up-in-google-search/

Even if your entire chat doesn't show up on Google, part of it might pop up later because it was used as training data.

And you can pretty much guarantee that whatever you tell it will be used later to target you with ads, and to manipulate you through a chatbot or algorithms.

AI companies are all harvesting data. Your data is valuable to them. Chatbots are there primarily to help them, not you.

I'll post here again with the link to the GD message.

Goonch

(4,435 posts)
23. Had to ask ;-{)
Fri Feb 13, 2026, 02:10 PM
8 hrs ago

"While Gemini is the primary name for Google's AI products, the terms are not the same. They represent different aspects of Google's operations.

Gemini: The Brand and the Model
Gemini is the brand name for Google's generative AI models and applications built on them.

The Model: Gemini is the underlying large language model (LLM) that powers reasoning and generation.
The Rebrand: In February 2024, Google renamed Bard (the chatbot) and Duet AI (workspace tools) to Gemini.
The App: It also refers to the standalone Gemini assistant app available on Android and iOS.

Google AI: The Broad Ecosystem
Google AI is a broader term that refers to the company's entire division, research initiatives, and historical developments in artificial intelligence.

Research Division: It includes Google DeepMind, the research unit responsible for creating Gemini and other technologies like AlphaFold (biology) and Gemma (open-source models).
Infrastructure: It includes the hardware that runs these models, such as Tensor Processing Units (TPUs).
Developer Tools: Tools like Google AI Studio are part of the Google AI ecosystem but are separate from the consumer Gemini app.

Summary Comparison
Primary Identity: Gemini is a specific product and model family; Google AI is an entire research division and field of study.
User Access: Gemini is reached through the chatbot app, Gmail integration, and the API; Google AI spans research papers, hardware, and developer platforms.
Key Function: Gemini handles generative AI tasks (writing, coding, chatting); Google AI tackles large-scale scientific and technical problems."

marcopolo63

(96 posts)
7. The Google AI provided this...
Thu Feb 12, 2026, 01:27 PM
Yesterday

- "The Seesaw" by A.E. van Vogt: The protagonist travels further and further into the past and future, eventually causing the Big Bang by releasing accumulated temporal energy, placing him outside normal space-time.

- Doctor Who (The TARDIS): The TARDIS is designed to travel through both time and space simultaneously. While it often lands on Earth, it frequently lands in deep space or on alien planets throughout the series.

- "The Star Diaries" by Stanislaw Lem: The main character, Ijon Tichy, travels in a "chronocycle" and frequently crashes into objects because his calculations fail to account for where Earth was previously located in space.

- Zathura: A Space Adventure (2005): Two brothers find a board game that transports their house directly into outer space.

- Time Trap (2017): A group of students exploring a cave experience time dilation, where they find themselves in a drastically different, futuristic setting that is isolated from the rest of the world.

- Callahan's Crosstime Saloon (Spider Robinson): A short story features a character who nearly asphyxiates because a time jump miscalculation leaves her floating in Earth's orbit.

highplainsdem

(60,988 posts)
8. Possibly Neal Shusterman's "Same Time Next Year" - though the time travel is to the future:
Thu Feb 12, 2026, 01:40 PM
Yesterday

And no, I haven't read it. I simply googled

time travel end up in space story

and the very first link was this Reddit thread

https://www.reddit.com/r/tipofmytongue/comments/zyxm7h/tomt_book_a_short_story_about_time_travel_that

Short summary of the story on another website:

https://readmeastoryink.com/stories/same-time-next-year/

It was first published in 1993.

Another description of the story here:
https://jojosbizarrebookblog.wordpress.com/2020/12/16/darkness-creeping-twenty-twisted-tales-by-neal-shusterman-part-1/

The web page below mentions a number of possibilities:

https://tropedia.fandom.com/wiki/Time_and_Relative_Dimensions_In_Space

raccoon

(32,300 posts)
10. Thank you all!
Thu Feb 12, 2026, 04:40 PM
Yesterday

It might have been one of the stories someone mentioned. I guess a lot of writers have used this idea by now.

You guys came up with some good suggestions to read.

Goonch

(4,435 posts)
18. highplainsdem is correct. AI fabricated
Fri Feb 13, 2026, 08:01 AM
14 hrs ago

AI, did you fabricate the following quote? (my question)

"Same Time Next Year"
by Neal Shusterman.
It was famously published in the horror anthology Bruce Coville's Book of Spine Tinglers II
(1996).

.
"Elias punched the coordinates for
1924 into the brass console, desperate to see his grandmother one last time. He pulled the lever, expecting the smell of ozone and the sight of her garden; instead, he was met with an absolute, crushing silence .
When the flash faded, there was no garden. There was no air. Through the reinforced glass of his pod, Elias stared at the cold, indifferent glow of distant nebulae [3]. He hadn't accounted for galactic drift—while he had traveled back a century in time, the Earth had continued its relentless orbit around the Sun, and the Sun had continued its 500,000-mile-per-hour sprint around the Milky Way.

He had reached the right time, but the Earth was billions of miles away . He was a ghost in the vacuum, a man who had forgotten that in the universe, you can never go back to the same place twice"

Yes, parts of that quote appear to be fabricated or paraphrased, though the story itself and its premise are real.

The short story "Same Time Next Year" was indeed written by Neal Shusterman and first published in the 1993 collection Darkness Creeping. It was later anthologized in Bruce Coville's Book of Spine Tinglers II in 1997.

Verification of the Details
While the story exists and centers on the concept of "galactic drift" (traveling through time but staying in the same physical coordinates in space), the specific text you provided contains several inaccuracies:

Protagonist Name: The original story features a girl named Marla Nixbok (or Marla and a boy named Ford), not a man named Elias.

The Destination: Marla travels one year into the future (hence the title "Same Time Next Year") to escape her current generation, not to 1924 to see her grandmother.

The Text: While the ending description of floating in the vacuum because the Earth moved is the actual twist of Shusterman's story, the specific phrasing ("Elias punched the coordinates," "cold, indifferent glow of distant nebulae," etc.) does not match the published text. It appears to be a modern AI-generated "re-telling" or a hallucination of the specific prose.

In the original story, the protagonist realizes too late that the scientist who built the machine didn't account for the Earth's orbit, leading to her suffocating in the void of space.
