1 (edited by Computron59 2017-06-30 17:16:16)

Topic: Time measurement in Neverball --- edit

I am making an AI to beat Neverball. See the attachment for a video link.

I am taking the time from "curr_clock" in game_client.c, which gives me the game time,
but when I call files with recorded tilts in them, they get called at slightly different times and this leads to a wrong path. Does anyone know, or know anyone who knows, how time is measured in Neverball? This would help me greatly. I am calling my files at a timestep of 250 ms and getting mixed results.

Please feel free to PM me or email computron59@gmail.com

Edit --- I find this happens more when the game is sped up!
It is still a problem at lower speeds, but not as chronic.

Am I speeding up the game in the correct way?

Post's attachments

linkforVideo.txt (32 b, 5 downloads since 2017-06-30)


Re: Time measurement in Neverball --- edit

It's a little difficult to respond without knowing all the details, but there are a couple of things that could be going wrong.

First of all, the authoritative time is tracked in ball/game_server.c, which is where all of the actual simulation happens. game_client.c is merely an endpoint for replay commands, which come from either game_server.c or demo.c.

In game_server.c you should also find the input struct, which can easily be written to a file. This was originally how replays were recorded. We switched away from recording input for one reason only: it was unreliable. It was unreliable even for frame-perfect updates (the Neverball simulation runs at 90 frames per second), so it's unsurprising that it would be unreliable for anything with a lower update rate (250 ms). At the time, the culprits were x86 extended-precision floating point and GCC optimizations giving unpredictable results between machines, between platforms, even between builds on the same platform and the same machine.

Long story short, I'd look into using SSE floating point if you haven't (SSE is the default on x86_64 and generally very predictable) and doing frame-perfect updates instead of 250 ms ones. The reason it gets worse at faster game speeds probably also has to do with the 250 ms update rate, since more in-game time passes during each 250 ms interval.


Re: Time measurement in Neverball --- edit

Hello Parasti!!

We meet at last.

Yeah. Thanks for telling me where to get the time, I'll try that out. I'm not 100% sure what you meant by SSE floating point; I guess I'll Google it until I understand.

I will do some testing and get back to you.

Did you see the video by the way?


Re: Time measurement in Neverball --- edit


I am really struggling here.

I am calling logfiles with tilts in them, but they get called at irregular times, which causes an odd path.

I am taking the time from game_server.c, in the "game_update_time" function. It seems to update every 0.01111111111 seconds (1/90 s), which tallies with the 90 Hz mentioned above.

I am still getting unusual behaviour: after about 20-25 seconds the path deviates drastically on each run, and the AI cannot learn.

Parasti mentioned a "frame-perfect update", and I have learned a lot about SSE, but I am still bamboozled as to how to get the path to recur faithfully.

How can I get these logfiles called at reliable times?

The problem is time. If the time is greater than T-interim * Ivalue, I increase Ivalue and run the loop, but this time seems to deviate and not occur regularly.

Should I be using ticks to calculate time?

Please help.