Timer issue out of control - seems to be initializing by default

Hi,

I am trying to use a timer to make the screen shudder for a second when the player lands on a cracked block. I start a timer “ScreenShake” when the player lands on the block, and then check: if the value of ScreenShake is < 1 s, run the shake, and if the value of ScreenShake is >= 1 s, delete “ScreenShake”. However, when I run the program, the screen shudders as if the timer is ON by default and the condition ScreenShake < 1 is passing. If a timer has not been initialized but the code hits a condition checking its value, does GDevelop by default treat the timer’s value as zero, and is that why my code is failing?
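To make sure I’m describing it clearly, here is roughly what my two events are meant to do, written as plain TypeScript-style pseudocode rather than my actual GDevelop events (startTimer, timerValue, deleteTimer and shakeCamera are just placeholders):

```ts
// Plain TypeScript-style pseudocode for my two events (NOT GDevelop syntax).
// startTimer, timerValue, deleteTimer and shakeCamera are just placeholders.
declare function startTimer(name: string): void;
declare function timerValue(name: string): number;
declare function deleteTimer(name: string): void;
declare function shakeCamera(): void;

function everyFrame(landedOnCrackedBlock: boolean): void {
  // Event 1: player lands on a cracked block -> start the timer
  if (landedOnCrackedBlock) {
    startTimer("ScreenShake");
  }
  // Event 2 (not a sub-event of the collision): shake while the timer is
  // under 1 s, delete the timer once it reaches 1 s
  if (timerValue("ScreenShake") < 1) {
    shakeCamera();
  } else {
    deleteTimer("ScreenShake");
  }
  // My question: if "ScreenShake" was never started, what does
  // timerValue("ScreenShake") return here? Zero?
}
```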

Please help!

Thanks!
Samik.

GDevelop creates/starts a timer the first time it reads an event using that timer.
Add a condition to check for collision of the player with the cracked block, perhaps?
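Conceptually (just an illustration of the behaviour, not GDevelop’s actual source code), a scene timer works something like this:

```ts
// Illustration only - NOT GDevelop's real implementation.
// Reading a timer's value creates and starts it if it doesn't exist yet.
const timerStartTimes = new Map<string, number>();

function timerElapsedSeconds(name: string, nowMs: number): number {
  if (!timerStartTimes.has(name)) {
    timerStartTimes.set(name, nowMs); // created and started on first read
  }
  return (nowMs - timerStartTimes.get(name)!) / 1000;
}

// So a condition like "ScreenShake < 1" is true the very first frame it is
// evaluated, even if no event ever started the timer explicitly.
```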

Thanks for your reply Gruk. That’s the weird part - I am starting the timer in a sub-event that only runs when the player is colliding with the cracked block, but I am checking the timer value outside that condition, because I want the screen shudder to continue even after the cracked block is deleted. But when I start the game, the screen starts shuddering as if the condition “ScreenShake < .5” is returning true even though the collision-with-cracked-block condition never runs. Below is a copy of the code.

Sorry - here’s the actual code. In the one before, I was trying out some other alternatives to debug.

As I said, the first time GD reads your timer condition, it creates and starts the timer if it doesn’t already exist.
So you need to add another condition. If the collision condition is not a good fit for you, use a variable: the collision turns a variable screenshake to one, and you reset it to zero when you delete the timer.
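In pseudocode (again, just a sketch of the structure, not actual GDevelop events, and shakeCamera is a placeholder for whatever moves your camera/layer):

```ts
// Sketch of the suggested structure: a "screenshake" variable gates the timer
// check, so the timer condition is never evaluated before a collision happens.
let screenshake = 0;                  // scene variable: 0 = off, 1 = shaking
let shakeStartMs: number | null = null;

function shakeCamera(): void {
  // move the camera/layer a few pixels back and forth here
}

function everyFrame(nowMs: number, playerHitsCrackedBlock: boolean): void {
  if (playerHitsCrackedBlock) {
    screenshake = 1;                  // collision turns the variable to one...
    shakeStartMs = nowMs;             // ...and starts/resets the timer
  }
  if (screenshake === 1 && shakeStartMs !== null) {
    const elapsedSeconds = (nowMs - shakeStartMs) / 1000;
    if (elapsedSeconds < 1) {
      shakeCamera();                  // shudder while under one second
    } else {
      shakeStartMs = null;            // "delete" the timer...
      screenshake = 0;                // ...and reset the variable to zero
    }
  }
}
```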

Thanks - I did that and it works. It’s weird that GDevelop creates a timer when it reads the condition - maybe it should be changed so that the timer is created/reset ONLY when the code explicitly starts/resets it?