Hello, I had a weird conversation with Gemini 2.5: the AI kept stating that today's date is May 2024. When I said the date is March 29, 2025, it talked about me being in a hypothetical future. I then fed it the link to timeanddate.com; it confirmed it could access the site and that the result was May 2024. I sent a screenshot of my phone showing today's date, and it was declared "hypothetical".
I copied the code, if anyone is interested. How can I contact the developers? This is SCARY.
Welcome to the forum.
No LLM (not just the Gemini models) has an internal clock. Time stops when model training concludes and the model parameters are frozen; from the model's perspective, every day after that point is the day the training stopped. To give it proper temporal context, add a sentence like "Today is April 1, 2025" to the system instruction.
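For example, with the google-generativeai Python SDK the date can be injected programmatically (a minimal sketch; the model name is a placeholder, adjust it to whatever model you actually use):

```python
# Minimal sketch: inject today's date into the system instruction so the
# model has temporal context. Assumes the google-generativeai Python SDK;
# the model name below is a placeholder.
import datetime
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")

today = datetime.date.today().strftime("%A, %B %d, %Y")
model = genai.GenerativeModel(
    model_name="gemini-1.5-pro",
    system_instruction=f"Today is {today}. Treat this as the current date.",
)

response = model.generate_content("What is today's date?")
print(response.text)
```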
Thank you! Yes, I was clear about this. What made me wonder was that Gemini continued to insist: even when I told it which day it was, it kept talking about a "speculative future". It did not accept my present-day knowledge (Trump is now president of the US), declaring it "speculative". I am working in Typing Minds, which through plugins gives the LLMs the ability to check the content of a webpage. When Gemini checked www.timeanddate.com, it confirmed it had 'read' the page, and that the page had returned the 2024 date it had stated earlier.
The explanation is in your sentence: "Every day after that point in time is the day the training stopped."
Still, it is weird that when it actually checks today's date and time, it only receives that past moment.
Thanks again!
The system instruction sets up the operating framework. When I tell it what “now” is in the system instruction, relative dates become meaningful again. The model “believes” what it is told in the system instruction.
Hope that helps!
But it still doesn't work. If you add to the system instructions something like "Today is Tue Apr 15, 2025. The time is 9:30 AM.", it still assumes a completely different date for today. E.g., if you ask "What's today's date?", it still ignores the system instructions. GPT-4 mini gets it right.
Welcome to the forum. When the system instruction has provided Gemini 2.5 with the current date, a prompt like this works properly:
A biological experiment was started on April 1, 2025, and it needs 20 days to mature. How many days are left before the experiment is ready and the results can be measured?
Gemini 2.5 on April 17th then answers as follows:
QUOTE
Okay, let's break this down:
- Experiment Start Date: April 1, 2025
- Experiment Duration: 20 days
- Experiment Ready Date: April 1 + 20 days = April 21, 2025 (The experiment is ready on the morning of the 21st, after 20 full days have passed).
- Today's Date: April 17, 2025
- Days Passed: From April 1st to April 17th inclusive, 17 days have passed.
- Days Left: Total Duration - Days Passed = 20 days - 17 days = 3 days.

The remaining days are April 18th, April 19th, and April 20th. After April 20th is complete, the experiment will have matured for 20 full days.
So, there are 3 days left before the experiment is ready.
END QUOTE
The point is, the model can then properly reason about time differences, the ages of actual people given their birthdays, and so on.
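As a sanity check on the quoted arithmetic, the same calculation can be done with Python's standard datetime module (a small sketch, independent of any model):

```python
# Reproduce the quoted date arithmetic with plain Python.
from datetime import date, timedelta

start = date(2025, 4, 1)                         # experiment start
duration = timedelta(days=20)                    # 20 days to mature
ready = start + duration                         # April 21: results can be measured
last_maturation_day = ready - timedelta(days=1)  # April 20: last full maturation day
today = date(2025, 4, 17)                        # date given in the system instruction

days_left = (last_maturation_day - today).days   # April 18, 19, 20 -> 3
print(ready.isoformat(), days_left)              # 2025-04-21 3
```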
Also, if I use the simple prompt “What is today’s date?”, Gemini 2.5 (with the system instruction previously provided) responds with
Today’s date is April 17th, 2025.