WSF_DELAY_PROCESSOR

processor <name> WSF_DELAY_PROCESSOR
   ... processor Commands ...
   ... Platform Part Commands ...
   ... External Link Commands ...
   ... WSF_SCRIPT_PROCESSOR Commands ...

   queuing_method ...
   number_of_servers ...
   time_distribution ...
end_processor

Overview

WSF_DELAY_PROCESSOR is used to simulate the finite amount of time required to process information. When the processor receives a message (e.g., an image or a track), it holds the message to simulate processing time and then forwards the message via its internal and external links.

Typically, the initial producer of a message, such as a sensor, links itself to this processor, and this processor links itself to the consumers.
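
For example, the platform fragment below (the part names are notional, and the linkage assumes the standard internal_link command) delays every sensor report by five seconds before it reaches the track processor:

   # Illustrative parts inside a platform or platform_type definition
   sensor radar WSF_RADAR_SENSOR
      internal_link delay_proc               # producer reports to the delay processor
      ... other sensor commands ...
   end_sensor

   processor delay_proc WSF_DELAY_PROCESSOR
      time_distribution constant time 5 seconds
      internal_link track_proc               # forward delayed messages to the consumer
   end_processor

   processor track_proc WSF_TRACK_PROCESSOR
   end_processor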

Commands

queuing_method [ first_in_first_out | last_in_first_out | none ]

Specifies how incoming messages are to be queued if all of the servers are busy. A value of none indicates the message will be discarded.

Default: first_in_first_out
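
For example, the processor below (the name and values are illustrative) accepts one message at a time and, because queuing_method is none, discards any message that arrives while that server is busy:

processor drop_when_busy WSF_DELAY_PROCESSOR    # illustrative name
   number_of_servers 1
   queuing_method    none
   time_distribution constant time 10 seconds
end_processor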

number_of_servers [ <integer> | infinite ]

Specifies the maximum number of messages that can be ‘in process’ at any given instant of time. If a new message is received and all of the servers are busy, the message will be queued according to the queuing_method.

If ‘infinite’ is specified (the default), received messages are simply delayed by the required amount before being forwarded.

Default: infinite
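
For example, the processor below (the name and values are illustrative) models two servers that each take 30 seconds per message; additional messages wait in a first-in, first-out queue until a server becomes free:

processor exploitation_delay WSF_DELAY_PROCESSOR    # illustrative name
   number_of_servers 2
   queuing_method    first_in_first_out
   time_distribution constant time 30 seconds
end_processor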

time_distribution constant time <time-value>

Defines that the simulated processing time will be a constant value equal to the specified time.

Default: constant time 0 seconds

time_distribution uniform minimum_time <min-time-value> maximum_time <max-time-value>

Defines that the simulated processing time for a message will be drawn from a uniform distribution with the specified range.

time_distribution gaussian mean_time <mean-time-value> sigma_time <sigma-time-value>

Defines that the simulated processing time for a message will be drawn from a Gaussian distribution with the specified mean and standard deviation.

time_distribution log_normal mean_time <mean-time-value> sigma_time <sigma-time-value>

Defines that the simulated processing time for a message will be drawn from a log-normal distribution with the specified mean and standard deviation.
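
For example, the lines below (values are illustrative) show each form of the command as it would appear inside a WSF_DELAY_PROCESSOR block; a given processor would typically specify just one:

   time_distribution constant time 10 seconds
   time_distribution uniform minimum_time 5 seconds maximum_time 15 seconds
   time_distribution gaussian mean_time 10 seconds sigma_time 2 seconds
   time_distribution log_normal mean_time 10 seconds sigma_time 2 seconds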