BigFix Insights Troubleshooting

This topic helps you troubleshoot various issues encountered in BigFix 10 Insights.

App Log Location

The BigFix Insights application is an app of the WebUI server that is configured as the ETL server. As such, its logs are located in the native WebUI log location, which is generally the following directory:

<Installation Drive>:\Program Files (x86)\BigFix Enterprise\BES WebUI\WebUI\logs
Note: If the WebUI application logs are configured to be stored in an alternative directory, the BigFix Insights application logs are present within that alternate directory as well.

BigFix Insights maintains a log within the log directory. The application name is Insights, so the log in this directory is called insights.log. The log is configured to roll as the logged events grow. Depending on the total size of the log set, you may notice log files suffixed with integer representations (that is, insights.log.1, insights.log.2, and so on). Take note of the file modification times when evaluating the log files, as the log rolls forward.

Advanced Logging

To enable verbose logging, modify the WebUI setting _WebUI_Logging_Filter to include bf:insights* (including the asterisk). The application or the WebUI service must be restarted for the change to take effect. With this filter in place, the Insights application log output becomes verbose. Because this output is very expansive, return the setting to its previous value once your debugging session has ended.

Note: ETL logs are now included in the same log file as the main Insights app logs, prefixed with bf:insights-etl.

SQL Tables of interest

ETL_Metrics

BigFix Insights maintains a log of all running and completed ETLs. This data is captured in the ETL_Metrics table on your Insights SQL Server database. This table should never be modified or altered manually. The following SQL query can be used to select information from this table:
SELECT TOP 1000 [id]
      ,[datasource_id]
      ,[start_time]
      ,[end_time]
      ,[duration_ms]
      ,[status]
      ,[detail_log]
      ,[preflights]
  FROM [dbo].[etl_metrics]
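
For a quick health check, you can also restrict the query to the most recent runs. The following is a minimal sketch that assumes the same table layout as above:

-- Most recent 20 ETL runs, newest first, with elapsed time in seconds.
SELECT TOP 20 [id]
      ,[datasource_id]
      ,[start_time]
      ,[end_time]
      ,[duration_ms] / 1000.0 AS [duration_seconds]
      ,[status]
  FROM [dbo].[etl_metrics]
 ORDER BY [start_time] DESC;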
The ETL_Metrics table documents each ETL from each datasource and maintains a detail_log column that captures the metrics for the given ETL. The column value is structured as JSON. The following is an example of the JSON value that is written per ETL.
{"step1":{"entity":"DatasourcePropertyResult","startSequence":"0x000000001fc08879",
"endSequence":"0x000000001fcd3510","startTime":"2020-03-28T18:26:19.463Z",
"endTime":"2020-03-28T18:26:20.363Z","durationMs":900},
"step2":{"entity":"ActionStateString","startSequence":"0x000000001fc08879",
"endSequence":"0x000000001fcd3510","startTime":"2020-03-28T18:26:20.376Z",
"endTime":"2020-03-28T18:26:20.503Z","durationMs":127},
"step3":{"entity":"DatasourceSite","startSequence":"0x000000001fc08879",
"endSequence":"0x000000001fcd3510","startTime":"2020-03-28T18:26:20.516Z",
"endTime":"2020-03-28T18:26:20.743Z","durationMs":227},
"step4":{"entity":"DatasourceActionsiteProperty","startSequence":"0x000000001fc08879",
"endSequence":"0x000000001fcd3510","startTime":"2020-03-28T18:26:20.743Z",
"endTime":"2020-03-28T18:26:20.870Z","durationMs":127},
"step5":{"entity":"DatasourceDevice","startSequence":"0x000000001fc08879",
"endSequence":"0x000000001fcd3510","startTime":"2020-03-28T18:26:20.980Z",
"endTime":"2020-03-28T18:26:21.103Z","durationMs":123},
"step6":{"entity":"DatasourceFixlet","startSequence":"0x000000001fc08879",
"endSequence":"0x000000001fcd3510","startTime":"2020-03-28T18:26:21.103Z",
"endTime":"2020-03-28T18:26:24.896Z","durationMs":3793},
"step7":{"entity":"DatasourceAnalysis","startSequence":"0x000000001fc08879",
"endSequence":"0x000000001fcd3510","startTime":"2020-03-28T18:26:24.910Z",
"endTime":"2020-03-28T18:26:25.053Z","durationMs":143},
"step8":{"entity":"DatasourceAnalysisProperty","startSequence":"0x000000001fc08879",
"endSequence":"0x000000001fcd3510","startTime":"2020-03-28T18:26:25.053Z",
"endTime":"2020-03-28T18:26:25.210Z","durationMs":157},
"step9":{"entity":"DatasourceAction","startSequence":"0x000000001fc08879",
"endSequence":"0x000000001fcd3510","startTime":"2020-03-28T18:26:25.223Z",
"endTime":"2020-03-28T18:26:25.633Z","durationMs":410},
"step10":{"entity":"DatasourceGroup","startSequence":"0x000000001fc08879",
"endSequence":"0x000000001fcd3510","startTime":"2020-03-28T18:26:25.633Z",
"endTime":"2020-03-28T18:26:25.693Z","durationMs":60},
"step11":{"entity":"DatasourceComputerGroup","startSequence":"0x000000001fc08879",
"endSequence":"0x000000001fcd3510",
"startTime":"2020-03-28T18:26:25.693Z","endTime":"2020-03-28T18:26:25.756Z","durationMs":63},
"step12":{"entity":"DatasourcePropertyMap","startSequence":"0x000000001fc08879",
"endSequence":"0x000000001fcd3510","startTime":"2020-03-28T18:26:25.773Z",
"endTime":"2020-03-28T18:26:25.900Z","durationMs":127},
"step13":{"entity":"StagingFixletField","startSequence":"0x000000001fc08879",
"endSequence":"0x000000001fcd3510","startTime":"2020-03-28T18:26:25.913Z",
"endTime":"2020-03-28T18:26:27.510Z","durationMs":1597},
"step14":{"entity":"StagingFixletResult","startSequence":"0x000000001fc08879",
"endSequence":"0x000000001fcd3510","startTime":"2020-03-28T18:26:27.526Z",
"endTime":"2020-03-28T18:26:27.590Z","durationMs":64},
"step15":{"entity":"DatasourceActionResult","startSequence":"0x000000001fc08879",
"endSequence":"0x000000001fcd3510","startTime":"2020-03-28T18:26:27.590Z",
"endTime":"2020-03-28T18:26:27.763Z","durationMs":173},
"step16":{"entity":"ContentResult","startSequence":"0x000000001fc08879",
"endSequence":"0x000000001fcd3510","startTime":"2020-03-28T18:26:27.780Z",
"endTime":"2020-03-28T18:26:28.030Z","durationMs":250}}
The log is easy to view in a JSON viewer (several applications are available online). The JSON is structured as an object that contains one member per ETL step. An example of a single step member is:
"step1":{"entity":"DatasourcePropertyResult","startSequence":"0x000000001fc08879",
"endSequence":"0x000000001fcd3510","startTime":"2020-03-28T18:26:19.463Z",
"endTime":"2020-03-28T18:26:20.363Z","durationMs":900}

The above example indicates that step 1 is the DatasourcePropertyResult step. The ETL was configured to ingest from sequence 0x000000001fc08879 and ended at sequence 0x000000001fcd3510. The step started at 2020-03-28T18:26:19.463Z and ended at 2020-03-28T18:26:20.363Z, for a total duration of 900 ms (0.9 seconds). The same principle applies to the rest of the log value. Understanding the results provides context for ongoing ETL operations. The start and end of the sequence range indicate the data that was retrieved within the step. Sequences are logged within the ingesting BFE server and provide a means of identifying whether something has changed and therefore must be imported. If the start and end sequences are the same value, nothing has changed in the source BFE database, so there is nothing to ingest.
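
If you prefer to analyze these metrics directly in SQL rather than in a JSON viewer, SQL Server 2016 and later can shred the detail_log column with OPENJSON. The following query is a sketch that assumes the detail_log values match the structure shown above:

-- Flatten the per-step metrics in detail_log into one row per step.
-- Requires SQL Server 2016 or later for OPENJSON and JSON_VALUE.
SELECT m.[id]
      ,m.[datasource_id]
      ,s.[key] AS [step_name]
      ,JSON_VALUE(s.[value], '$.entity') AS [entity]
      ,JSON_VALUE(s.[value], '$.startSequence') AS [start_sequence]
      ,JSON_VALUE(s.[value], '$.endSequence') AS [end_sequence]
      ,CAST(JSON_VALUE(s.[value], '$.durationMs') AS int) AS [duration_ms]
  FROM [dbo].[etl_metrics] AS m
 CROSS APPLY OPENJSON(m.[detail_log]) AS s
 ORDER BY m.[id], CAST(REPLACE(s.[key], 'step', '') AS int);

Adding a WHERE clause that compares the two sequence values (for example, start_sequence equal to end_sequence) lists only the steps that had nothing to ingest.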

Problem sets and potential remediations

Table 1. Problem sets and potential remediations

Problem or Issue: Issue in deploying the Insights DBMS
Course of Action: Verify that the user credentials provided in the Insights setup have the appropriate permissions to create the target database on the Insights SQL Server. Verify that the proper target Insights SQL Server is accessible over the port specified (if none is specified, port 1433 is used). A simple verification exercise is to review the SQL logs on the Insights DB server to verify that the connection is being made and authentication is taking place (SQL Server can be configured to audit successful and unsuccessful logins).
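
A permission check can also be run on the Insights SQL Server while logged in as the setup account; the following is a sketch using built-in SQL Server functions:

-- Run while connected as the account supplied during Insights setup.
-- Lists the server-level permissions effectively granted to that login.
SELECT * FROM fn_my_permissions(NULL, 'SERVER');

-- Returns 1 if the login can create databases on this server, 0 otherwise.
SELECT HAS_PERMS_BY_NAME(NULL, NULL, 'CREATE ANY DATABASE') AS [can_create_database];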

Problem or Issue: Data not updating for a given datasource
Course of Action: Insights ingests from a database target that is a hosted replica of the live BigFix Enterprise server. Verify that the replica has been updated with a recent dataset (by backup, replication, and so on). A simple examination of the ETL_Metrics table can confirm whether data has changed on the ingestion target; the table logs every ETL to every datasource, and the section above explains how to interpret its data. Comparing the step sequence IDs of the previous successful run with those of the most recent run confirms whether data has indeed evolved in the underlying database. If the data is changing, Insights should be ingesting from the datasource; review the ETL logs and confirm that the ETL completes successfully. If the ETL repeatedly fails to complete successfully, contact support.
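
The comparison described here can be done with a query; the following sketch pulls the two most recent runs for one datasource (substitute your own datasource_id) so the sequences in detail_log can be compared side by side:

-- Two most recent ETL runs for datasource 1 (substitute your datasource_id).
-- Identical startSequence/endSequence values across the two runs indicate
-- that no new data was available to ingest.
SELECT TOP 2 [id]
      ,[start_time]
      ,[status]
      ,[detail_log]
  FROM [dbo].[etl_metrics]
 WHERE [datasource_id] = 1
 ORDER BY [start_time] DESC;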

Problem or Issue: Unable to delete a datasource
Course of Action: Within BigFix Insights there is a concept of a primary linked item. The primary linked item is assigned when an external site is selected for ingestion. BigFix Insights does not allow you to remove a datasource that is associated as the primary for a given site. The primary for that site must first be reassigned to another datasource before the datasource can be deleted.

Problem or Issue: Unable to connect to a datasource
Course of Action: Insights retrieves information from a corresponding datasource. Verify that the user credentials provided are correct and have the appropriate permissions to read from the target datasource database. Verify that the proper target datasource SQL Server is accessible over the port specified (if none is specified, port 1433 is used). A simple verification exercise is to review the SQL logs on the datasource DB server to verify that the connection is being made and authentication is taking place (SQL Server can be configured to audit successful and unsuccessful logins).
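
As with the DBMS deployment issue above, read access can be verified while logged in to the datasource SQL Server as the configured account; a minimal sketch, assuming the datasource database is named BFEnterprise:

-- Run while connected as the account configured for the datasource.
-- Returns 1 if the account can SELECT from the BFEnterprise database.
SELECT HAS_PERMS_BY_NAME('BFEnterprise', 'DATABASE', 'SELECT') AS [can_read_datasource];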

Problem or Issue: “Start Fresh” with Insights – revert to a “new install” because there are datasources that you want to remove
Course of Action: To completely reset, run the following query against the BFEnterprise database to which the WebUI reports:

delete from [BFEnterprise].[dbo].[webui_data] where App = 'insights'
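
Before running the delete, you can review exactly which rows it will remove:

SELECT * FROM [BFEnterprise].[dbo].[webui_data] WHERE App = 'insights'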

The Insights database itself can be deleted or retained. When you revisit the Insights WebUI application, either connect to an existing Insights database or create a new database. If you want to retain your previous Insights database, give the newly created Insights database a different name.