Intervals vs timers
Intervals and timers serve different purposes. Here’s a brief explanation of each:
- Intervals: fixed periods of time, such as seconds, minutes, hours, or days. Intervals define the time window over which data is gathered and analyzed in applications such as trading algorithms. For example, analyzing market data on a 15-minute interval means you are working with data points collected every 15 minutes.
- Timers: mechanisms for scheduling a specific action or function to run after a certain period has elapsed. Timers are useful when you need to perform tasks at particular moments, such as updating data, sending requests, or executing trade orders. Note that the core Lua language has no built-in timer functionality; timers come from libraries, external modules, or the scripting environment itself.
For example, you can use the OptimizedForInterval function to limit updates so they only occur at specific intervals:
```lua
local interval = InputInterval('SMA Interval', 15)
local c, sma = OptimizedForInterval(
  interval, -- run the following function every X minutes
  function()
    local c = ClosePrices(interval) -- grab close prices for the same interval
    local sma = SMA(c, 20)          -- calculate the SMA
    Plot(0, 'SMA', sma, Purple)     -- plot the SMA
    -- return the prices and SMA
    return c, sma -- returned values are cached until the next run
  end
)

-- Plot the same SMA again on every update cycle:
Plot(0, 'SMA_Spam', sma, SkyBlue)
-- Plot the X-minute interval close prices:
Plot(0, interval..'min Close', c, DarkGray)
```
With timers, you can schedule actions yourself. The snippet below maintains two timers: one kept in a global variable and one persisted between runs with Load and Save:
```lua
-- Timer state kept in globals, initialized only once:
if not init then
  globalTimer = Time()
  globalTimerInterval = 5 * 60 -- 5 minutes
  init = true
end

-- Timer state persisted with Load/Save, defaulting to the current time:
local loadedTimer = Load('timer', Time())
local loadedTimerInterval = 15 * 60 -- 15 minutes

-- Fires every 5 minutes, then resets the global timestamp:
if Time() >= globalTimer + globalTimerInterval then
  Log('globalTimer: TICK')
  globalTimer = Time()
end

-- Fires every 15 minutes, then saves the new timestamp:
if Time() >= loadedTimer + loadedTimerInterval then
  Log('loadedTimer: TICK')
  Save('timer', Time())
end
```
In summary, intervals are fixed time periods used for data analysis, while timers are tools for scheduling the execution of actions or functions at specific time intervals. Although they serve different purposes, they can be used together in various programming scenarios, such as trading algorithms or other time-sensitive applications.
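As a rough sketch of how the two can work together, the snippet below recomputes an SMA only once per interval while a separate timer logs a heartbeat at most once per hour. It assumes the same environment functions used in the examples above (InputInterval, OptimizedForInterval, ClosePrices, SMA, Time, Load, Save, Log); the one-hour cadence and the 'lastLog' key are illustrative choices, not part of any particular strategy.

```lua
-- Illustrative sketch only: combines an interval-bound calculation with a timer.
local interval = InputInterval('SMA Interval', 15)

-- Interval: the SMA is recalculated once per interval; the result is cached.
local sma = OptimizedForInterval(interval, function()
  return SMA(ClosePrices(interval), 20)
end)

-- Timer: log a heartbeat at most once per hour, persisted across runs.
local lastLog = Load('lastLog', Time())
if Time() >= lastLog + 60 * 60 then
  Log('hourly heartbeat: SMA is up to date')
  Save('lastLog', Time())
end
```

The interval controls how often the expensive calculation runs, while the timer controls how often a side effect fires; keeping the two concerns separate makes each one easy to tune independently.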