There are two parts to data analysis:
1) Loading the data into the database for analysis;
2) Running the analysis module to derive results.
The FORCES data analysis module does not automatically save data from a run, nor does it automatically load the saved data into the database. Before making a run it is worthwhile verifying that the appropriate data will be saved. To do this, enter the "Mission Planner" mode for the selected scenario (click here for instructions). Then go to "Scenario Defn"->"Data Collection". Click here to see the widget that will be displayed. What follows is a quick description of the widget's use to get a first-time user started.
First, verify that the flag to collect data is on - otherwise no data will be saved. You can usually ignore the sampling frequency: it only affects how often detection data is recorded when there is no change of detection status; all key events and changes are saved on the collected messages regardless of this setting. Typically I set this value to the largest number possible, though setting it lower will provide a smoother display on the "Detection Profile" report (described later). Then verify that the data will be saved in a file and directory that exist on the system; if the directory does not exist, the data cannot be collected. Using this widget to select the save location guarantees that the file is written to a valid path. If you have any doubts about whether the data is being collected, verify that the file is in this location after the run by typing the following in a terminal window:
"ls -l (filename)"
If the file exists, you're fine. If not, check the permissions on the file and its directory.
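That check can be scripted. Below is a minimal sketch in shell; the helper name and the demonstration path are hypothetical, and in practice you would pass the filename you entered in the Data Collection widget:

```shell
#!/bin/sh
# check_collection_file: succeed only if the named collection file
# exists and is non-empty (an empty file usually means the collection
# flag was off or the directory was wrong).
check_collection_file() {
    if [ -s "$1" ]; then
        echo "ok: data collected in $1"
    else
        echo "missing or empty: $1 - check the collect flag, path, and permissions" >&2
        return 1
    fi
}

# Demonstration against a temporary stand-in for the real file:
demo=$(mktemp)
echo "event data" > "$demo"
check_collection_file "$demo"
rm -f "$demo"
```

The `-s` test covers both failure modes at once: a file that was never created and a file that was created but never written to.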
Next, verify that the desired messages will be stored. The choice "allmsgson" is the default, indicating that data on all messages should be kept. To review any other selections, click on the "Edit/Create Message Set" button. A list of the messages will appear along with a brief description of the information saved for each. The first several messages are written in light gray in this interface, indicating that they cannot be turned off. Those written in black are user selectable.
Finally, verify that data is being collected on assets of interest. FORCES permits the user to collect data on subsets of assets in order to minimize the size of the collection file. Data collection filtering can either be by individual assets or classes of assets (e.g. all bombers of a specified type). A common default is setting the data collection filter to be by class and selecting "allclasseson" as the filter. This will collect data on all scenario participants. Other filtering options can be set by tailoring the class and object sets as desired.
As stated above, FORCES does not automatically save data from runs into the database. This is done to increase the runtime speed. The data collected by a FORCES execution is stored in the file specified by the user during Mission Planning under the "Scenario Definition->Data Collection" widget. For a number of reasons (including disk storage limitations) the data collected can be tailored according to event types (also called message types) and the involved assets. Verify (under this interface) that 1) you're collecting data, and 2) the data you're interested in has not been eliminated by these filters before making your run(s). To check what data (if any) is being collected during a run refer to the previous section.
Generally there are two ways of loading the stored data from a run into the database. The first is to use a batch script (e.g. batch_run) that automatically stores data from a run into the database at the successful conclusion of that run. The second is to load the data through the FORCES interfaces, as shown here.
This is an unusual interface in that the widget presented differs according to whether the system is in a runtime mode or any other mode. If the system is in a runtime mode this screen is shown. The point is that the scenario already has the file information and doesn't need to query the user for any details. However, this method assumes that the simulation is still running, which might not be the case for a number of reasons (e.g. the simulation has already reached its maximum duration as specified by the user during the "Mission Planner" mode). To test this before collecting the data, do the following:
1) Pause the simulation
2) Verify the simulation's still running by clicking on "System Controls"->"Monitor Execution"->"Verify Connection". If a message stating "Sim Responding" appears in the blue message area within a few seconds, proceed to step 3. Otherwise you'll have to restart the FORCES interface and load the data as described in the next paragraph.
3) Load the data by selecting "System Controls"->"DBA"->"Scenario Maintenance"->"Load Data for Analysis". You should receive notification that the simulation is paused to load data and then a second message indicating that the data load is completed. At this point you can either:
i) Resume the scenario by clicking "System Controls"->Resume, or
ii) Perform analysis on the loaded data by selecting "System Controls"->"DBA"->"Scenario Maintenance"->"Run Data Analysis". The data just loaded will be the highest run_id in the appropriate database.
If the simulation is not running (either the data was saved from a previous run or the simulation has terminated in the current run) you should see this interface when you select the "Load Data for Analysis" option. If you don't, you'll have to restart the FORCES interface. Then select "System Controls"->"Data Analysis"->"Load Data for Analysis". Unlike when the scenario was running, you'll be presented with a widget to select the file to load for analysis. This file should be the same file you specified for data collection in the "Mission Planner" on the widget shown here. This interface also permits you to enter a run description.
The inputs are:
Filename - Selection causes the Select Input Datafile Form to appear where the filename is entered. This is the file specified by the user during Mission Planning under the "Scenario Definition->Data Collection" widget. If you don't know what it is, go back there and look.
Database - Selection causes the Select DB Form to appear where the database is selected. Typically the same database the scenario is defined in is used, though this is not required.
Run Description - A free-format description (limited to 60 characters) to make it easier to track data from a series of runs later.
Note: If you're running a series of runs using the batch_run utility, you do not have to manually load the data for analysis; this script does that automatically.
Go to data analysis by selecting:
"System Controls"->"DBA"->"Scenario Maintenance"->"Run Data Analysis"
This widget should appear. This is the main entry point for performing the standard FORCES data analysis. Before describing it, it should be mentioned that the critical data generated by FORCES is stored in a relational database with the intent of aiding users in generating their own tailored reports. FORCES provides a range of standard reports, but if you don't see what you need, FORCES currently can save data on over 60 events and stores over 600 data fields. Given the mix-and-match techniques available through a relational database, there is a huge number of reports that could be generated. To see what data is being collected, see the section on the "Quick Plot" button.
What follows is an overview of how to run the current data analysis package.
Data analysis is available in all modes - Mission Planning, Scenario Execution, or when no mode has been selected. To run data analysis, fill in at least the first four buttons, namely:
The Scenario and DB buttons both bring up the Select Scenario Form, which lists all the scenarios available to select from. Each scenario name and its associated database are listed. Highlighting one and selecting the Accept button (or double-clicking) makes the choice. Note: don't worry too much about selecting the right scenario at this point; you'll have another chance when you pick the run ID. Note also that you don't have to pick both the database and the scenario button - either will fill in both fields.
The Run ID and Run Description buttons both bring up the Run Selection for Data Analysis Form. This presents the scenario chosen earlier along with its available versions. Included for each is the ID, Scenario, and Description the user may select. Again, you don't have to pick both buttons - either will fill in both fields. You're also permitted to change the scenario at this time.
At this point you have the ability to either generate plots and related screen reports based upon the run selection or specify the data to be reported in a tabular file. The fields appropriate to either of these modes are identified in this diagram. There's nothing preventing an analyst from starting in one mode (e.g. looking at the plots) and then going to the other (e.g. saving the tabular results in a file). The intent of saving the data in tabular format in a file is to aid the analyst in transferring the results to other analysis tools, like Excel. To generate an output file of tabular data, first select an output file by clicking on that button, then select from the following options:
Aggregate Reports (The timeline options will provide data over time based upon hour and/or day)
Loss Summary - Losses by category and platform type per side.
Kill Summary - Kills by category and platform type per side. This identifies the most effective attack elements.
Detection Summary - Identifies in an aggregate sense which elements were detected and which assets detected enemy assets.
Sortie Summary - Identifies sortie rates by aircraft type and airbases
Chemical Attack Summary - Identifies who succeeded in making chemical attacks and the affected assets & resources.
Multirun Loss Summary - Like the Loss Summary (above) but includes means and standard deviations across multiple runs.
Specific Unit Loss Summary - Identifies losses in each reporting unit. Generally this means per Army TO&E
Specific Target Loss Summary - Identifies strategic targets lost and when it happened.
Specific Target Kill Summary - Identifies who killed each strategic target.
The following data can be collected either for each day or hour of the scenario. This subdivision is selected by the "Daily Reports/Hourly Reports" radiobuttons at the top of the list. Note that even the multi-run report can be broken down this way. If aggregate reports are all that are required, use the summaries listed above.
Losses over Time - Losses by category and platform type per side.
Kills over Time - Kills by category and platform type per side. This identifies the most effective attack elements.
Detections over Time - Identifies in an aggregate sense which elements were detected and which assets detected enemy assets.
Sorties Versus Time - Identifies sortie rates by aircraft type and airbase.
Chemical Attack Timeline - Identifies who succeeded in making chemical attacks and the affected assets & resources.
Multirun Loss Timeline - Like Losses over Time (above) but includes means and standard deviations across multiple runs.
Specific Unit Loss Summary - Losses per unit (e.g. Brigade) broken down by time interval.
All selected reports will be generated and saved in the same file. These reports can take a while on large scenarios, so don't conclude that something's broken just because the system and interface stop responding. In addition to the data, the file includes a description of each of the fields, including key information like the reason codes for assets being lost. At this time the tabular file is formatted to make it easy to load into Excel rather than to read directly (though it's not terribly ugly), but format suggestions are welcome.
Alternately, the user has the option of evaluating results interactively.
The Engagement Profile button causes the Engagement Target List Form to appear, which allows the user to designate the asset required for the Engagement Profile. Only Air-to-Air and Surface-to-Air engagements are considered at this time - no ground engagements. Picking a target will give an output like this sample. Further options can be assigned to this specific action via the appropriate off/on button and then re-double-clicking on the asset of interest.
Animate - Updates the map interactively to give an indication of the temporal order between events
Sort by Target - Changes the list of assets (and the resulting report) so events are organized around a target instead of around the attacker.
Sort by Asset/Site - Changes the asset list and reporting results so events are organized around an attacker instead of around the target.
In any case, the results are displayed in two coordinated windows. In the larger (map) window the geometry of the events is displayed, with light blue indicating the attacker and red indicating the target regardless of side color. Lines are drawn (when appropriate) between the positions of the target and the attacker at individual events. In addition, an event log is reported in a separate window above the map. These widgets are coordinated: an event can be highlighted in the log window and the corresponding geometric positions are highlighted on the map. Events can also be double-left-clicked on the map, and the corresponding target (or attacker) location and the associated entry in the log window will be highlighted. Clicking on map positions with the center and right buttons will provide additional data on the map, including event times and locations. The map by default is drawn in black, but an interface on the left of the map widget permits the user to put other map backgrounds under the display. Click here for an example. To add to or modify this list, put a new map background (.gif images are supported) into the $MAP_DIR directory and add a line in $MAP_DIR/library.dat describing this map in terms of the lat/long of its corner points.
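Installing a new background can be scripted. A sketch, assuming $MAP_DIR is set (a scratch directory is used here when it isn't); the map name is a placeholder and the library.dat field order is a guess - copy the format of an existing entry in your own library.dat rather than trusting this one:

```shell
#!/bin/sh
# Fall back to a scratch directory when $MAP_DIR is unset; normally it
# points at the FORCES map directory.
MAP_DIR="${MAP_DIR:-/tmp/forces_map_demo}"
mkdir -p "$MAP_DIR"

# "korea.gif" is a placeholder name; copy in your real .gif background.
touch korea.gif
cp korea.gif "$MAP_DIR/"

# Describe the map by the lat/long of its corner points. The field
# order below is an assumption - match an existing library.dat entry.
echo "korea.gif 33.0 124.0 43.0 131.0" >> "$MAP_DIR/library.dat"

grep "^korea.gif" "$MAP_DIR/library.dat"
rm -f korea.gif
```

After this, the new background should appear in the list on the left of the map widget the next time the map is displayed.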
The Scenario Log button causes the Select Element Form to appear with the Select Report Element Type data listed. The Asset or Group may be activated. An additional log may also be added by selecting the Additional Log button. In any case, a detailed chronology of events related to the group or asset selected will be presented. This data may be printed or stored to a file.
When a Scenario Log is generated for a group the following is provided in addition to the event log:
A plot of the strength of the group as a function of scenario time.
A bar chart of the initial and final strengths.
A listing of the initial and final strengths of this group and all reporting subgroups. The strengths are divided into "organic" and "composite" strengths. The organic strengths relate to the assets directly composing this unit; the composite strength adds the assets that report to any subgroup of this unit to the organic assets.
Event lists for all reporting subunits.
While this makes the report long, the extra information is generally required to understand the scenario results in detail.
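A toy calculation may make the organic/composite distinction concrete (all numbers hypothetical): a unit's composite strength is its own organic assets plus the strengths of every subgroup reporting to it.

```shell
#!/bin/sh
# Hypothetical brigade with two battalions reporting to it. The
# battalions have no subunits, so their composite equals their organic
# strength.
brigade_organic=120
battalion_a=90
battalion_b=85

# Composite strength = organic assets + all reporting subgroups.
brigade_composite=$((brigade_organic + battalion_a + battalion_b))
echo "brigade organic=$brigade_organic composite=$brigade_composite"
```

This prints an organic strength of 120 and a composite strength of 295 for the brigade, matching the way the listing in the report separates the two columns.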
The Detection Profile button causes the Target List to appear along with the data identifying the asset. The controls and information associated with this button are similar to those for the Engagement Profile button. Again, interactive coordinated map and log listings are generated for all key detection events; the reports can focus on either the target or the sensor site and can be immediately generated or animated. Again, GIF map backgrounds can be used when desired.
The Quick Detections selection brings up the Target List Form with Pick target option. This button will provide a detailed detection log against any target.
The Comm Band Data button provides comm band utilization statistics, but it relies on the comm band definition tables "comm_bands", "comm_available" and "comm_defaults" being loaded in the database before the scenario is executed. To load these, enter the Mission Planner mode, go to "Scenario Definition"->"Communications", and fill in the values under "Define Bands" and "Define Default Loading". Without this data the results will be as unimpressive as this sample output.
Quick Losses - A single click displays a number of charts showing the losses to each side by category, platform type and strategic target, along with a losses-over-time plot for comparison. Click here for an example.
The Quick Kills button will generate kills by category and platform type for each side. This is intended to indicate the principal engagement assets in the scenario. But be forewarned: because FORCES uses a modified Lanchester equation for direct-contact ground war results, the losses cannot be correctly assigned to specific assets or even asset types in an aggregate conflict. Therefore the numbers shown indicate kills by aircraft, SAMs, missiles and artillery, but not direct-fire weapons like tanks. Formats are similar to the Quick Losses.
The Quick Plot button causes the Data Analysis Plotter Form to appear. This function was intended to give the non-SQL user the ability to look into the database and generate some simple results in a free-form mode. While this is occasionally done, the more valuable use of this interface is for the user intending to create a new report based upon the data tables in the database: it lets him review the data kept by table (as shown in this diagram). When he finds a table of interest he can double-click on it and a description of the fields in that table is presented. See Table w/Description for an example. If he wishes to use the quick plot for a quick look into the data, he is given a variety of options for grouping, plotting, and filtering the data. But again, this tool is limited by not having a table-joining interface. This has not been developed because the table-joining utility in the pgaccess database utility has proven adequate to support non-SQL users.