
I have put together a query to enhance inventory by adding LDoS dates and ‘approved software’. It has greatly improved our Life Cycle Management conversations. The CSV files reside separately in a ‘standards’ folder (there are just too many fields to keep in the same script).

The resulting output provides:
Vendor | Hostname | Model | Location | Vendor LDoS Date | HW Compliance Date | Current OS | Approved OS | Compliance to OS | Replacement Model | and more


/**
 * @intent Inventory of all devices in FN
 * @description Take inventory and then cross-check it against the CSV files for SW/HW info
 **/

import "Home/Network Standards/01_NetworkVars/99_Support_Detailed";

foreach device in network.devices
let platform = device.platform
let snapshotInfo = device.snapshotInfo

foreach entry in deviceSupportDetail
where entry.Model == platform.model
let LDOS = entry.HW_LDoS
let STATUS = entry.LCM
let comma = ", "
let ApprovedSW = entry.SW_N + comma + entry.SW_N1 + comma + entry.SW_N2
let Replacement = entry.HW_Replacement
let Unsupported = !matches(platform.osVersion, entry.SW_N)
let Current = platform.osVersion

select {
  Vendor: platform.vendor,
  Name: device.name,
  Model: platform.model,
  "Model LDoS": LDOS,
  "LCM Status": STATUS,
  "Current OS": Current,
  "Approved OS": ApprovedSW,
  "OS Compliance": (if Current == entry.SW_N
                    then "good"
                    else if Current == entry.SW_N1
                    then "good"
                    else if Current == entry.SW_N2
                    then "good"
                    else "fail"),
  "Replacement Model": Replacement,
  Tags: device.tagNames,
  "Collection-Status": when snapshotInfo.result is
    collectionFailed(collectionError) ->
      "Collecting - " + toString(collectionError);
    completed -> "Collected";
    processingFailed(processingError) ->
      "Processing - " + toString(processingError),
  "Device Location": device.locationName,
  "Management IP(s)": platform.managementIps
}


If anyone can optimize, great! I’m still learning, so any input is appreciated. For information/help - rcariddi@gmail.com

Thank you,
Rich


Hi @cariddir. Your query looks great!

I only have two minor things to suggest.

First, in the definition of ApprovedSW, you can use a handy built-in function called join(separator, list) to comma-separate the items. Specifically, you can replace that line with this:

let ApprovedSW = join(", ", [entry.SW_N, entry.SW_N1, entry.SW_N2])

Second, in the definition of the “OS Compliance” column, you can avoid the series of if-else expressions by using the “in” membership test. Specifically, you could write that column like this:

  "OS Compliance": if Current in Centry.SW_N, entry.SW_N1, entry.SW_N2]
then "good"
else "fail",

Finally, I can imagine that it might be hard to keep your LDoS data up to date and to ensure its quality. You could consider writing one or more queries that run quality checks on it. For example, one that comes to mind is to find device models in your network that are not present in your LDoS dataset. You could do that with this query:

supportedModels =
  foreach entry in deviceSupportDetail
  select distinct entry.Model;

foreach device in network.devices
let platform = device.platform
where isPresent(platform.model) && platform.model not in supportedModels
select distinct { Vendor: platform.vendor, Model: platform.model }

Then you can review this list and see if it is as you’d expect; that is, these are the device models your LDoS dataset does not cover.

You could go a step further and add these models to your LDoS dataset or mark them as “known-unsupported” models, so that the query only outputs rows for models that are unexpected to you. In other words, you could do something like:

knownUnsupportedModels = ["WS-C3650-48TS-L", "ASR1002-HX"];

supportedModels =
  foreach entry in deviceSupportDetail
  select distinct entry.Model;

foreach device in network.devices
let platform = device.platform
where isPresent(platform.model)
  && platform.model not in supportedModels
  && platform.model not in knownUnsupportedModels
select distinct {
  Vendor: platform.vendor,
  Model: platform.model
}

Since that would only output rows if your dataset is incomplete or bad, you could enable this as an NQE Verification and have the platform notify you when it outputs any rows.


Andreas,

Thank you for the input! I’ve implemented both suggestions, and this is GREAT. Yes, it was really painful to keep that original file up to date and to find devices that were not listed in the LDoS data. I would have to run 2 separate queries, export them, and ‘xlookup’ the heck out of it in Excel to find the discrepancies; now I can track this MUCH more easily.

Sincere thanks!
Rich


@cariddir - This post is referenced in pretty much all of my conversations as an example of extending the data lake of information to parse and analyze.  Thank you for the post.  Very valuable.

The method you kindly explained is a multi-step process of:
1. Obtain the data set in CSV or JSON format.
2. Copy and paste it into an NQE query that serves as a data set to be referenced elsewhere. Declare the export (export deviceSupportDetail = ...) in that NQE so that it can be imported and referenced (see the sketch after this list).
3. Publish the NQE so that others may reference the NQE data set.
4. Reference the NQE data set as an import. 
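
For concreteness, here is a minimal sketch of what the data-set NQE in step 2 could look like. The field names mirror the ones used in Rich’s query above, and the two rows are made-up placeholders rather than real LDoS data:

export deviceSupportDetail = [
  // One record per CSV row; the values below are illustrative placeholders.
  {Model: "WS-C3850-48T", HW_LDoS: "2025-10-31", LCM: "End of Sale",
   SW_N: "16.12.10a", SW_N1: "16.12.9", SW_N2: "16.12.8",
   HW_Replacement: "C9300-48T"},
  {Model: "N9K-C93180YC-EX", HW_LDoS: "2029-05-31", LCM: "Supported",
   SW_N: "10.2(5)", SW_N1: "10.2(4)", SW_N2: "9.3(12)",
   HW_Replacement: "N9K-C93180YC-FX3"}
];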

The publishing step does provide revision control, and it is a very straightforward process.

People usually ask as a follow-up, “How can we make this more dynamic?” Introducing Data Connectors.

Data connectors integrate and augment network data from sources external to the network itself. They enable Forward Enterprise to retrieve, parse, and structure data using HTTP(S) GET requests. Data connectors can enhance visibility, improve analytics, and integrate external data sources into Forward Enterprise.


[Image: Example JSON output from the Forward Networks Data Connector documentation.]

Let's say that one wants to import a JSON data file rather than pulling it directly from an application.  Maybe this is due to two-factor authentication requirements, application access restrictions, perhaps the Collector is not able to establish the connection to the Web site, or there is no specific URL to ingest the data.  

Proposed Idea:

- An application periodically exports the data to be imported into a JSON file. (If the output is in CSV format, use Bash or another scripting language to convert it to JSON.)

- The file is copied to, or created in, a directory that is published via HTTP(S) and requires a user ID and password.

- A Data Connector is created to import the data file during Collection.

- The data set is automatically added to Forward Networks during the Collection process and can then be referenced in an NQE report or Decorator.

----
The Data Connector name that one designates during creation is dynamically added to the NQE model: dataConnector.<name>.
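
As a quick sketch of how that could be consumed, suppose one created a connector named ldosFeed (a hypothetical name) whose endpoint returns a JSON array of records containing Model and HW_LDoS fields. Assuming the connector exposes the parsed response as a list of records, a query along these lines could read it:

// ldosFeed is a hypothetical connector name; this assumes its parsed JSON
// is a list of records. Adjust to the actual shape of your connector's data.
foreach entry in dataConnector.ldosFeed
select {
  Model: entry.Model,
  "Model LDoS": entry.HW_LDoS
}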

Thus one could extend the Forward Networks data model to an application as long as its output is JSON, whether delivered via a published file or another method that one can creatively imagine.


Your thoughts?  

Thanks for a great post and creativity.  

