Writer Sanctum

Other & Off-Topics => Bar & Grill [Public] => Topic started by: Post-Doctorate D on September 17, 2019, 08:35:56 AM

Title: Why is 1/100th of a Second Slower than 1/50th of a Second?
Post by: Post-Doctorate D on September 17, 2019, 08:35:56 AM
That's my current dilemma.

I'm making an animated GIF.  If I set the frames to last 1/100th of a second each, the animation seems choppy.  If I set the frames to last 1/50th of a second each (which is 2/100ths of a second each), the animation is snappy like it should be.

I have another part of the animation set at 5/100ths of a second per frame and it runs faster than the frames set at 1/100th of a second.

I guess I'll set the fast frames at 2/100ths of a second each and live with it.

:shrug
Title: Re: Why is 1/100th of a Second Slower than 1/50th of a Second?
Post by: Lynn on September 17, 2019, 09:02:32 AM
Well, in decimals, it's easier to see. :)

1/100 = 0.01
1/50 = 2/100 = 0.02

If a frame lasts .02 seconds, it's longer than a frame that lasts .01 seconds. :D

So the longer the frame lasts, the slower the change between frames. BUT you have to take how many frames you have into consideration.

If you have 100 frames at .02 seconds each, it's going to last 2 seconds before going back to the beginning and starting over.

Those same 100 frames at .01 seconds each are only going to last 1 second. So the animation is going to be moving faster, because each individual frame stays on screen for a shorter time.
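If you want to double-check that math, a couple of lines of Python will do it (nothing fancy, just multiplying it out):

frames = 100
print(frames * 0.02)  # 100 frames at .02 seconds each -> 2.0 seconds per loop
print(frames * 0.01)  # 100 frames at .01 seconds each -> 1.0 second per loop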

In other words, the longer the delay, the slower the animation. :D

Unless I'm all messed up and wrong. But I don't think so. :D

Title: Re: Why is 1/100th of a Second Slower than 1/50th of a Second?
Post by: Post-Doctorate D on September 17, 2019, 09:05:06 AM
In other words, the longer the delay, the slower the animation. :D

Yes.

So why do 5 frames playing at .01 seconds each play slower than 5 frames playing at .02 seconds each? The runtime for the former should be .05 seconds and .10 seconds for the latter.
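Checking my own arithmetic in Python, and putting it in frames-per-second terms:

print(5 * 0.01, 1 / 0.01)  # 0.05 seconds total, i.e. 100 frames per second
print(5 * 0.02, 1 / 0.02)  # 0.1 seconds total, i.e. 50 frames per second

So the .01 version should finish in half the time, not crawl.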
Title: Re: Why is 1/100th of a Second Slower than 1/50th of a Second?
Post by: Lynn on September 17, 2019, 09:11:16 AM
I'm stumped.  :icon_think:

Title: Re: Why is 1/100th of a Second Slower than 1/50th of a Second?
Post by: Jeff Tanyard on September 17, 2019, 10:24:04 AM
I have a theory.

Set the rate at 1/60 and report your results.

Then set the rate at 1/120 and report your results.
Title: Re: Why is 1/100th of a Second Slower than 1/50th of a Second?
Post by: Post-Doctorate D on September 17, 2019, 11:56:38 AM
I have a theory.

Set the rate at 1/60 and report your results.

Then set the rate at 1/120 and report your results.

The only option in my software is to set each frame's display length in terms of x/100ths of a second.
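For anyone who wants to reproduce this outside my point-and-click program, here's a minimal sketch using Pillow in Python (the library is just my pick for illustration; it's not what I'm animating with). The GIF format itself stores each frame's delay in hundredths of a second, which is presumably why my software only offers x/100ths. Pillow's duration argument is in milliseconds, so 10 ms is 1/100th of a second.

from PIL import Image, ImageDraw

# Build five simple frames: a black square marching across a white strip.
frames = []
for i in range(5):
    im = Image.new("RGB", (120, 60), "white")
    ImageDraw.Draw(im).rectangle([i * 20, 20, i * 20 + 20, 40], fill="black")
    frames.append(im)

# Same frames, two different per-frame delays (duration is in milliseconds).
frames[0].save("fast.gif", save_all=True, append_images=frames[1:], duration=10, loop=0)  # 1/100 s
frames[0].save("slow.gif", save_all=True, append_images=frames[1:], duration=20, loop=0)  # 2/100 s

If whatever's biting me lives in the viewer rather than in my software, fast.gif should drag the same way.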
Title: Re: Why is 1/100th of a Second Slower than 1/50th of a Second?
Post by: Jeff Tanyard on September 17, 2019, 01:02:09 PM
I have a theory.

Set the rate at 1/60 and report your results.

Then set the rate at 1/120 and report your results.

The only option in my software is to set each frame's display length in terms of x/100ths of a second.


Well, my theory is that your computer display's refresh rate is 60 Hz, so you'd want your GIF's frame durations to line up with it (multiples of 1/60th of a second) for best results.   :shrug
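To put some numbers on that theory, here's a sketch in Python, assuming a decoder that can only swap frames on a screen refresh and so rounds each requested delay up to the next 60 Hz tick:

import math

refresh = 1 / 60  # one refresh every ~16.7 ms at 60 Hz
for delay in (0.01, 0.02, 0.05):
    # Round the requested delay up to a whole number of refresh intervals.
    effective = math.ceil(delay / refresh) * refresh
    print(f"asked for {delay:.2f} s, would get {effective:.4f} s")

Under that assumption, 1/100th and 2/100ths both get stretched (to about 0.0167 s and 0.0333 s), while 5/100ths lands exactly on three refresh ticks, which might be why that part of your animation behaves itself.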