Monday, September 28, 2020

Fun with Microsoft Power BI - Part I - Intro

 By Tony Lee

If you have read some of our other articles, you can probably tell by now that we enjoy making data actionable. Honestly, it doesn't matter what type of data or even where the data ends up -- as long as we can make informed decisions using the data, we love it. Following that theme, we are going to make BlackBerry (formerly known as Cylance) Protect Threat Data Report (TDR) CSVs actionable using Power BI and Power BI Desktop. Sure, we could have used Excel and some charts here and there, but Power BI is a better fit for creating reusable, decision-maker-ready reports. You can use any data source to follow along in this series, but our example BlackBerry Protect report is shown below. We will happily share the Power BI (.pbix) file at the end of the series so you can load and analyze your own data, so stay tuned for that!

Figure 1:  Our Power BI report using BlackBerry Protect TDR data

In this first article, we will cover:

  • Getting Started
  • Data Ingest
  • Adjusting Fields
  • Visualizations
  • Saving Your Work

Getting Started

There are many options for using Microsoft's Power BI, each with varying costs and features.  As a high-level overview:

  • Power BI Desktop - Free thick client which can be used to ingest data and design reports
  • Power BI Pro - $9.99 monthly per user pricing (included in E5 license)
  • Power BI Premium - $4,995 monthly pricing - Enterprise BI big data analytics
  • Power BI Mobile - iOS, Android, HoloLens, PC, Mobile Device, Hub apps
  • Power BI Embedded - Analytics and visualizations tailored for embedded applications
  • Power BI Report Server - On-premises reporting solution, included in premium and can provide hybrid on-prem and cloud capabilities


For our learning purposes, we used Power BI Desktop to develop our report and the Power BI service (online) to display it (full screen) in our private workspace. We used a free cloud account and did not upgrade to Pro.

Note:  You cannot use a personal account to sign into Power BI.  You must use a work or school account.  Chances are you have one of these accounts and it has some (even free) access to Power BI.

Data Ingest

Now that you have downloaded Power BI Desktop, we need to ingest data. As mentioned at the start of the article, we are using BlackBerry Protect TDR data which is downloaded from the BlackBerry/Cylance portal in a CSV format. Once the data set is downloaded, in Power BI Desktop, click Home > Get Data > Text/CSV and navigate to the file.

Figure 2:  Many options for loading data

Power BI Desktop did a great job parsing the data into columns with the appropriate headers. It even tries to detect the type of data, such as string vs. number vs. date.

Figure 3:  Parsing of fields

Adjusting Fields

You should now see the fields on the right-hand side of the canvas. Note that there may be some instances in which Power BI takes the liberty of summarizing your data--sometimes this is helpful and sometimes it does not make sense. This is understandable since it still takes a human to determine the context around various data fields. A good example: Power BI Desktop tried to sum the BlackBerry/Cylance Protect scores, which is of no real value to analysts.  "A" for effort though, and at least it is correctable by clicking on the parsed field on the right > Column tools > Summarization > Don't summarize.

Figure 4:  Adjusting the parsed fields

Don't worry about trying to find all of the misinterpreted data up front. You will discover some of these as we start creating visualizations in our report.

Note:  Power BI prefers columnar data, so spreadsheets that are appealing to the human eye are not always interpreted correctly by Power BI. That level of transforming and manipulating will be left to another article.


What may be most impressive about Power BI is the number of visualizations available by default. These include (but are not limited to):

  • Area charts
  • Bar and column charts
  • Cards (numeric value)
  • Combo charts
  • Doughnut charts
  • Funnel charts
  • Gauge charts
  • Key influencers chart
  • KPIs
  • Line charts
  • Maps (ArcGIS, filled choropleth, and shape)
  • Pie charts
  • Ribbon charts
  • Treemaps

Figure 5:  Visualization options

And the list goes on...  

To start with a simple visualization, let's create a card with the total number of events (in this case it correlates to the number of rows we have... each row in our data always contains a DeviceName). Begin by clicking anywhere in the canvas and then click the card icon under Visualizations. Drag the field you want to count to the Fields box under Visualizations (in our case it was DeviceName). Then click the down arrow and select Count. There are lots of formatting options available by clicking the paint roller under Visualizations. We encourage you to explore those settings to achieve your desired look.

Figure 6:  Created our first visualization - a card that contains the count of events

Saving Your Work

Now that you have ingested data, parsed it, and created your first visualization in your report, it is time to save it. Click File > Save As > name the file. Notice that the file extension is .pbix. Feel free to close Power BI Desktop, re-open your file, and notice that the data is still there. This indicates that the data is self-contained within the .pbix file -- keep this in mind when sharing your .pbix files with others.


In this article, we showed the different options for downloading and using Power BI. Specifically, we downloaded Power BI Desktop and ingested BlackBerry (Cylance) Protect Threat Data Report data in CSV format. Power BI Desktop parsed the data, and we showed one potential alteration to the way the data was interpreted. Lastly, we rounded out the article by showing how to create your first visualization and save your report locally as a .pbix file. Our follow-on articles will cover more advanced visualizations, relationships, filters, using reports, and uploading reports to the Power BI service (online). Thanks for reading; we hope you enjoyed this introduction to Microsoft's Power BI. Please feel free to leave feedback and your favorite Power BI features in the comments section below.

Tuesday, September 1, 2020

Testing Logstash Data Ingest

 By Tony Lee

When setting up an Elasticsearch Logstash Kibana (ELK) stack, you may discover the need to test your Logstash filters. There are probably many ways to accomplish this task, but we will cover some techniques and potential pitfalls in this article. 

Note: This article can be used to simulate any syslog replay for manual data ingest testing, but our example specifically covers Logstash as the receiver and a fabricated, but realistic, event.

Real-time Visibility

The first thing that will aid us in this ingest testing is real-time feedback. This could come from at least two potential places:

1)  "Real-time" discovery search within the Kibana UI

This is accomplished by using the discovery dashboard, setting a search term you know to be in your test message, and a short refresh rate as shown in the screenshot below.

Figure 1:  Kibana "Real-time" search using refresh interval

2) Monitoring the logstash log file for warnings or errors

For this, we will use the tail command with the follow option (-f) to watch it in real-time.  The location of your Logstash log may differ so adjust the path as necessary.

tail -f /var/log/logstash/logstash-plain.log

We are looking for clues that may have prevented proper ingest such as:

[2020-08-31T20:25:21,997][WARN ][logstash.filters.mutate][main][b14e-snip-a422] Exception caught while applying mutate filter {:exception=>"Could not set field 'name' on object 'compute-this' to value 'see-this'. This is probably due to trying to set a field like [foo][bar] = someValue when [foo] is not either a map or a string"}

Encrypted or Unencrypted Listener

Once we have real-time visibility in place, we need to determine if the listening port is expecting encrypted data since this will determine how we replay traffic to it. There are a few ways to determine this:

1) Check the logstash filter config file

The example below shows an encrypted port.  We can tell that is the case because the SSL information is defined and ssl_enable is set to true.

input {
  tcp {
    port => 6514
    ssl_enable => true
    ssl_cert => "/etc/logstash/logstash.crt"
    ssl_key => "/etc/logstash/logstash.key"
    ssl_verify => false
  }
}

2) Check the logstash logs for SSL errors

If you have an encrypted listener and you send data using an unencrypted transport method (such as telnet), you will see SSL errors such as the following:

[2020-09-01T13:46:20,758][ERROR][logstash.inputs.tcp][main][359d9-snip-c5038d] Error in Netty pipeline: io.netty.handler.codec.DecoderException: error:100000f7:SSL routines:OPENSSL_internal:WRONG_VERSION_NUMBER

3) Use a tool (such as openssl) to verify the SSL connection

Below is an example of checking the SSL connection using openssl, but other tools can be used.

openssl s_client -connect <logstash_host>:6514

If the listener is expecting encrypted data, you will see details such as certificate subject, issuer, cipher suites, and more.


subject=C = AU, ST = Some-State, O = Internet Widgits Pty Ltd

issuer=C = AU, ST = Some-State, O = Internet Widgits Pty Ltd


SSL handshake has read 1392 bytes and written 437 bytes

Verification error: self signed certificate
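If you want to script this check rather than eyeball the openssl output, a rough sketch is below. It keys off the "SSL handshake has read" line shown above; LOGSTASH_HOST and LOGSTASH_PORT are placeholder variables for your environment:

```shell
# Probe the port: attempt a TLS handshake and check openssl's output.
# If the handshake stats line appears, the listener is speaking TLS.
HOST="${LOGSTASH_HOST:-localhost}"
PORT="${LOGSTASH_PORT:-6514}"
if echo | openssl s_client -connect "${HOST}:${PORT}" 2>/dev/null \
    | grep -q "SSL handshake has read"; then
  echo "${HOST}:${PORT} appears to be an encrypted listener"
else
  echo "${HOST}:${PORT} appears to be plaintext (or unreachable)"
fi
```

The "echo |" closes the connection immediately after the handshake attempt so the command does not hang waiting for input.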


Methods of Replay

Now that we have real-time visibility and we know if the listener is expecting encrypted data or not, we can look at different techniques to replay the traffic. We will start with unencrypted methods first because we can later tunnel the unencrypted data to an encrypted listener. We will also examine replaying an entire packet (including the header) vs. replaying just the data and having the header added.

1) Unencrypted replay of an exact packet (includes the specified time; no header is added)

If you have example logs that already include the expected date/time format, you can replay the exact message using netcat/ncat.  Keep in mind that you are most likely sending a static time, so you will need to set your Kibana time range appropriately.

First, place the contents of your event in a test file. We created a file called testevent.txt with the following contents (notice the included date and time):

<46>1 2020-08-31T02:08:08.988000Z sysloghost ExtraField - - [Location] Event Type: OurTest, Event Name: Blocked, Device Name: DESKTOPTESTLOGGER2, File Path: C:\Windows\system32\foo.ps1,  Interpreter: Powershell, Interpreter Version: 10.0.18362.1 (WinBuild.160101.0800), User Name: SYSTEM, Device Id: 4eaf3350a984

Second, use netcat or ncat to send the data to your listening port. The example shown below is sending to an unencrypted listener (be sure to replace logstash_host with the IP or hostname of your logstash server):

ncat <logstash_host> 6514 < testevent.txt

Then just monitor the logstash log and real-time search in Kibana to see the event and/or potential errors.
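As a side note, if you would rather not widen the Kibana time range, you can rewrite the embedded timestamp before replaying. A rough sketch using GNU date and sed is below (it assumes the single ISO 8601 timestamp format shown in testevent.txt above; the abbreviated sample event and the testevent_now.txt name are ours):

```shell
# Abbreviated copy of the sample event saved to testevent.txt.
printf '<46>1 2020-08-31T02:08:08.988000Z sysloghost ExtraField - - [Location] Event Type: OurTest, Event Name: Blocked\n' > testevent.txt

# Swap the original timestamp for the current UTC time.
# GNU date: %6N truncates nanoseconds to 6 digits (microseconds).
NOW=$(date -u +%Y-%m-%dT%H:%M:%S.%6NZ)
sed -E "s/[0-9]{4}-[0-9]{2}-[0-9]{2}T[0-9:]{8}\.[0-9]{6}Z/${NOW}/" \
    testevent.txt > testevent_now.txt
cat testevent_now.txt

# Then replay exactly as before, e.g.:
# ncat <logstash_host> 6514 < testevent_now.txt
```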

2) Encrypted replay of an exact packet (includes the specified time; no header is added)

For this we will use the same testevent.txt file from above and nearly the same command, but we will add --ssl to force ncat to perform the encrypted handshake.

ncat --ssl <logstash_host> 6514 < testevent.txt

3) Unencrypted replay of an exact packet contents with an updated time

If you have packet contents, but want the header updated with the current time, you might be able to use the logger command in Linux.  The trick here is to get logger to reproduce your expected header. Use the following steps to attempt this:

Understand the logger options:

logger --help

Read in our test event and output to stderr for troubleshooting:

logger -s -f testevent.txt

Use logger options to alter the header (in our case, it was --rfc5424=notq) to match what we need and then create a new file with only the content and no header.  Ex:  testevent_noheader.txt

Figure 2:  Reproducing the event with an updated header

Send the event to the unencrypted listener and check for it in Kibana and Logstash logs:

logger --rfc5424=notq -s -f testevent_noheader.txt --server <logstash_host> --tcp --port 6515

4) Encrypted replay of an exact packet contents with an updated time

Unfortunately our version of logger does not have an option to enable encryption. So, if you were able to get logger to reproduce the header + content in the step above, but need to send it to an encrypted listener, you could once again use ncat to assist. The following command creates an unencrypted listener on your local host on port 6515--then anything written to that local port will be sent on in an encrypted state to port 6514.

Figure 3:  ncat listener to send data onto Logstash using SSL

Step 1) Create the listening wrapper:

ncat -l 6515 -c "ncat --ssl <logstash_host> 6514"

Step 2)  Send the packet to the wrapper using logger:

logger --rfc5424=notq -s -f testevent_noheader.txt --server localhost --tcp --port 6515


We are just scratching the surface of ways to test data ingest components such as Logstash. For instance, this could be expanded to include scripting with variables that are replaced with random data to generate more robust values. But that will be an exercise left to the reader (or maybe a future article). We do hope this article proved useful, and we would love to know what you use for testing data ingest. Feel free to leave comments in the section below.  Thanks for reading.

Thursday, July 23, 2020

Fun with AWS CLI - Cost Explorer (ce) API

By Tony Lee

If you are an avid Amazon Web Services (AWS) consumer and have been thinking about ways to integrate these services into third-party tools (such as CyBot), you have come to the right place. In this article we will cover how to set up and use the aws2 client (AWS CLI) to easily interact with AWS' APIs.  Per Amazon:  "AWS Command Line Interface (AWS CLI) is an open source tool that enables you to interact with AWS services using commands in your command-line shell."  Our example below will illustrate just one possibility by using the Cost Explorer (ce) API to determine your current and past AWS bills via the command line or a third-party tool.

In other words, we are ditching the Web interface shown below:

Figure 1:  AWS Billing Web Interface

In favor of querying the information via the command line:

Figure 2:  AWS Billing query via command line

Or better yet, a ubiquitous tool such as your favorite chat application using CyBot:

Figure 3:  Enabling CyBot to query the AWS API

We will cover the following tasks to enable this integration:

  • Configuring AWS Permissions
    • Policy creation
    • Group creation
    • User creation
  • Installing the aws2 client
  • Configuring the aws2 client
  • Using the aws2 client

Cost Estimation Requirements

Before we dive into the configuration steps, there are two main requirements:
  • Access to the following endpoint:
  • aws2 client
    • A while back, we tried version 1 of the aws client, but it lacked the cost explorer (ce) command, thus we had to upgrade to version 2 (both are invoked via the aws command)

Configuring AWS Permissions

Note:  This step generates your AWS Access Key ID and AWS Secret Access Key.  Copy these down in a safe place because you will need them in the client configuration section.

First create and retrieve your AWS credentials via IAM:

Policy Creation
Once authenticated, create a policy to allow only cost estimation queries (following the principle of least privilege, it may be possible to restrict this even more, but that is an exercise left to the reader):

  1. Click "Customer Managed Policies"
  2. Create policy button
  3. Click the JSON tab and paste the following:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "ce:*"
            ],
            "Resource": [
                "*"
            ]
        }
    ]
}

Note:  ce:* against all resources keeps this policy limited to Cost Explorer calls; tighten the action list further if you wish.

Figure 4:  Policy Creation via JSON

Group Creation
Now that the policy exists, create a Cost Estimator Group and assign it the newly created policy by performing the following steps:

  1. Click "Groups"
  2. Click Create New Group Button
  3. Name the group: CostEstimatorAPI
  4. Attach a policy:  CostExplorerPolicy
  5. Next
  6. Apply

User Creation
Now that the group exists, create a user and assign it to the newly created group by performing the following steps:

  1. Click "Users"
  2. Add user button
  3. Name the user
  4. Access type:  Programmatic access
  5. Select the CostEstimatorAPI Group we created
  6. Click next button
  7. Click Create user button

Figure 5:  User is created and tied to the group that has cost estimator permissions

Installing the aws2 Client

We installed the aws client on an Ubuntu Linux VM.  If you are using a different operating system, feel free to follow the instructions contained in the following documentation:

curl "" -o ""


sudo ./aws/install

Now check the version:

/usr/local/bin/aws --version

You should see something like the following:

aws-cli/2.0.33 Python/3.7.3 Linux/4.4.0-31-generic botocore/2.0.0dev37

Note:  This needs to be version 2.x or higher for the cost explorer functionality to exist.

Configuring the aws2 Client

Note:  Have your AWS Access Key ID and AWS Secret Access Key ready from the Setup AWS Permissions section

aws configure

AWS Access Key ID [None]: AK[redacted]YY
AWS Secret Access Key [None]: vn[redacted][redacted][redacted]Rm
Default region name [None]: 
Default output format [None]: 

Using the aws2 Client

When using the aws client, you can use the following syntax to get help figuring out how to achieve your goals:

  aws help
  aws <command> help
  aws <command> <subcommand> help

Be sure to also consult the docs for syntax help and example usage.  In our case, we are looking at cost explorer functionality:

Specifically for our example, we will look to retrieve the cost and usage information:

After a bit of trial and error to get the exact format of the required parameters, we have the following:

aws ce get-cost-and-usage --time-period Start=2020-07-01,End=2020-08-01 --granularity MONTHLY --metrics "BlendedCost" "UnblendedCost" "UsageQuantity"

Which yields the following output:

{
    "ResultsByTime": [
        {
            "TimePeriod": {
                "Start": "2020-07-01",
                "End": "2020-08-01"
            },
            "Total": {
                "BlendedCost": {
                    "Amount": "xx.6272352241",
                    "Unit": "USD"
                },
                "UnblendedCost": {
                    "Amount": "xx.6272352241",
                    "Unit": "USD"
                },
                "UsageQuantity": {
                    "Amount": "xxx.1172177568",
                    "Unit": "N/A"
                }
            },
            "Groups": [],
            "Estimated": true
        }
    ]
}

Two things to note about the time period parameter (--time-period Start=2020-07-01,End=2020-08-01):
  1. It would be ideal if the aws client accepted terms such as "this year", "this month", "this week", or "today", but now that this is integrated into our third-party tool, we can create that logic ourselves.
  2. At least the client allows you to select a date in the future (ex: the first of next month) for calculating the present time period.
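Since our integration has to build those relative terms itself, here is one way to compute a "this month" window for the --time-period parameter. This is a sketch that assumes GNU date; the variable names are ours:

```shell
# "This month" = first of the current month through the first of next month.
START=$(date -u +%Y-%m-01)
END=$(date -u -d "${START} +1 month" +%Y-%m-01)
echo "Start=${START},End=${END}"

# Plug the values into the query, e.g.:
# aws ce get-cost-and-usage --time-period Start=${START},End=${END} \
#   --granularity MONTHLY --metrics "BlendedCost" "UnblendedCost" "UsageQuantity"
```

"This week" or "today" would work the same way with different date arithmetic.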


We hope this article serves as a quick introduction and jumpstart to your interactions with the AWS API using the freely available client. While not perfect, this client is quite powerful and can be used to quickly interact with the Amazon Web Services API to build integrations into third-party tools.  Feel free to leave any comments in the section below.  Happy Hacking.


These are references which we found to be useful:

Thursday, April 23, 2020

Easy Cord Cutting Guide - Zero to Hero

By Tony Lee

In my house, we don't watch a lot of traditional cable TV, so we ditched cable years ago and instead increased our Internet bandwidth so we could stream more content such as Amazon Prime, Hulu, Netflix, and now Disney+. Additionally, we purchased a couple of inexpensive and easy-to-install over the air (OTA) antennas for two TVs so we can get local news + Good Morning America, Wheel of Fortune, and Jeopardy! This setup, which provided around 18 channels total, worked great until Covid-19 hit and the family started watching a little more TV. I figured now is a good time to level up our cord cutting game and reduce some household stress (everyone else's stress--not mine... ha!). This guide is intended to give you the quick basics to get up and running fast using freely available over the air TV signal.

Topics covered:

  • Checking Channel Availability
  • OTA Tuners
  • Antenna Choice
  • Antenna Direction
  • Signal Distribution
  • OTA Digital Video Recorder (DVR)

Checking Channel Availability

Before you even spend one penny, you can check to see what sort of channels you can expect to easily receive by using one of a handful of free websites. Two are shown below:

After entering your address, the map shows the transmitters relative to your house and in which direction to expect the signal.  Figure out which channels you care about and target those transmitters first.

Figure 1:  Map showing transmitters relative to my house

The same site will also list the number of the channel, the affiliate (the name you know), and the distance, as shown in Figure 2 below.  Depending on the terrain, you typically have a very good chance of picking up channels within 20 - 25 miles. Transmitters that are 35 - 50 miles away can be a little difficult, and the signal can cut in and out on smaller antennas that are not attic or roof mounted.  Anything above 50 miles can be very difficult to receive without proper conditions and the antenna mounted as high as possible.

Figure 2:  Channel lineup and the likelihood of receiving the channels

OTA Tuners

Once you have determined the stations you want and the estimated feasibility, you need to make sure your TV has an OTA tuner. The good news is that most flat panel TVs come with built-in tuners that you can use to find channels when hooked up to an OTA antenna. Each TV will vary in how to accomplish this, but generally you want to go to TV input > Antenna > Scan for channels. If you have an older TV that does not have a built-in tuner, you can purchase one for very little money, but check your TV first since a tuner is probably already built in. Tuner quality can make a difference, but in most cases it will be minor. A sample tuner for $30 is shown below if your TV does not already have one:

Figure 3:  Sample TV Tuner

Antenna Choice

Now that we have confirmed we have an OTA tuner, there are a ton of antennas to choose from, and we could write a novel just covering antenna theory, design, and options. But we want to give you the basics, so we will cover three of the best options here:

1)  Flat antenna - We received a max of 18 channels with this antenna
Good for one TV tuner placed on an outer wall or window in the general direction of the towers--this is fairly discreet and the easiest and cheapest way to get started and test the waters with minor risk.  Sometimes you can even hide these antennas on the wall behind the TV with the double-sided tape provided. We used this for years to get the local channels on a couple of TVs and it worked great. But when we started consuming a bit more TV content, we moved to the larger antennas mentioned below.

Note: This antenna comes with an inline amplifier. Try it first without the amplifier and then with it. Each time you change the direction or configuration of the antenna, you need to rescan for channels. If you are already close to the towers, the amplifier can work against you by overpowering the signal. If you are farther from the towers, the amplifier may be beneficial.

Subnote:  This antenna claims it has a 120 mile range, but that is quite an exaggeration. The planets would probably need to align for that to happen.  Meh.

Figure 4:  Basic flat antenna that you can stick to the wall or window

Subsubnote:  We also tried a TV-mounted bar antenna in an inner room without access to an outer wall and the experience was not as good, but it might be worth trying. It just depends on your conditions and proximity to the transmitters.  An example is shown below:

2)  Inexpensive Small-ish Yagi Antenna - We received a max of 29 channels with this antenna
Now we will move into an antenna category that is probably too large to sit in your living room, but it should produce more channels by receiving signals from farther away. At around $50 regular price, this RCA yagi is worth the $20 upgrade over the flat antenna, but now you need to attic, ground pole, or roof mount it and figure out where to run the coax (we ran a new coax cable to a wiring panel). We placed this antenna in our attic right on top of the insulation since it was so lightweight. It claims to be a 70 mile antenna, and that could be the case given perfect conditions.  There is very little assembly required, but the instructions are absolutely terrible, so you may need to read them a million times to get it right...

Figure 5:  RCA Yagi Antenna

Figure 6:  The RCA yagi is so light weight it just sat on top of the insulation

3)  More expensive larger Antenna - We received a max of 43 channels with this antenna!!
If the RCA yagi did not get you exactly what you are seeking, you can spend a tad more money and go for a more capable antenna such as the Antennas Direct Clearstream 4. This also claims a 70 mile range, which is quite possible given excellent conditions. This antenna impressed us right away with its easy assembly, increased number of channels, and very strong signal.  The signals that came in weak with the RCA yagi came in great with this one. If you have $100 to spend on an antenna, it is worth it. This antenna not only picked up CBS, NBC, ABC, and FOX from 20 miles away, it also picked up PBS, which we were not even receiving with the RCA yagi!

Figure 7:  Antennas Direct Clearstream 4

Figure 8:  Pole mounted the Clearstream due to the weight and inability to sit upright with the insulation

Antenna Direction

Before you climb into the attic to mount the antenna, make sure you download an antenna app or two to tell you the rough direction in which to point the antenna. We used two apps to spot check each other.

RCA Signal Finder app:

Digital TV Antennas app:

Both of these apps use the GPS and compass capabilities in your phone to point you in the direction of the transmitter towers. We suggest you use multiple apps to check each other; this will get you very close. You may need someone standing at a TV confirming the signal quality though. One of our TVs (TCL 49S325) actually has a signal meter, which was handy for this part.

Figure 9:  Using a phone app in the attic to determine the direction that the antenna needs to point

Signal Distribution

If you are using an attic, pole, or roof mount antenna, you probably want to run this signal to multiple TVs which requires splitting the signal. You could use a 2 or 3-way splitter to send it to a couple of TVs or your house may have a distribution panel which allows you to send it to all coax wall jacks. Either way, make sure you select the proper hardware below.

Passive Splitters:
For around $10 or less, you could try a passive splitter to send the signal to multiple TVs; however, these introduce loss, which matters far more when receiving OTA signal than with conventional cable TV.  Each split in the line can reduce the number of channels you can find and the overall signal quality. These are not really recommended--especially when chained. However, if you need to split the signal, try to get higher quality splitters with decent reviews. Two examples can be found below:

2-way splitter:

3-way splitter:

"Lossless" distribution amplifier:
Signal boosters that also split the signal are typically more expensive ($40+) than passive splitters; however, they should perform better--especially when splitting the signal to a high number of TVs. There are a fair number of options here, but the most popular seem to be Leviton, Channel Master, or Antronix.

Figure 10:  Channel Master distribution amplifier

Figure 11:  Picture of my wiring "closet" - Distribution box into some splitters - needs some clean up.  :-|

8 output Channel Master Distribution Amplifier:

9 Port Antronix Bi-Directional Cable TV Splitter Signal Booster:

OTA Digital Video Recorder (DVR)

The last step to complete the premium OTA experience is to provide a TV guide (some, but not all, TVs can generate this for you) and the ability to record, pause, and fast forward your favorite shows. For this, you will need an OTA DVR, which can be the most expensive component of cutting the cord, but it may also be the step that makes it most palatable. There are a number of options listed below, but we chose the Amazon Fire TV Recast since we are already a household driven by Amazon and Alexa. This enables us to record 2 channels at the same time and watch 2 channels at the same time through the Fire sticks and any Echo Show device. The UI can be a little clunky at first, but once you get used to it, the experience is quite nice. My only wish is that it could stream to more than 2 devices at a time. If we need to do so, we just use the TV's tuner to watch live TV, but we don't get to fast forward or pause when we use the TV's tuner directly.  #FirstWorldProblems - let's keep it in perspective... Some other options are below, but note that some of them do not come with a hard drive and that is an additional cost.

Figure 12:  Fire TV Recast

Fire TV Recast ($194 on sale and includes the hard drive):

Tablo Quad OTA DVR ($199 and still needs a hard drive):

TiVo Bolt ($199 and includes the hard drive):


This article outlines the various levels of cord cutting OTA heroism, which are summarized below. You can spend as little as $30 for a decent experience, or you can spend several hundred dollars for an experience so good most would not know it is OTA. Either way, you will recover your investment over the long run, and regardless of the reason for cutting the cord, you have plenty of options to still receive useful content for a reasonable time and money investment. Please feel free to leave feedback in the comments section.  Enjoy!

Level 1: - ~$30 - $60 total
 - Confirming channels - Free
 - Confirming tuner availability - Free or $30 max
 - Adding a flat antenna for a single TV - $30

Level 2: - ~$160
 - First two in level 1
 - Antenna is mounted on roof, attic, or pole in the proper direction - $100
 - Signal is distributed to multiple TV sets or the whole house via a powered distributor - $55

Level 3: Cost of Level 1 or 2 + $200+
 - All or most of level 2
 - OTA Digital Video Recorder

Figure 13: Progression of OTA super hero status

Tuesday, April 7, 2020

Battle Covid-19 Using 3D Printing

By Tony Lee

This is a bit of a different topic than we usually cover on this blog, but it is difficult to ignore the current situation regarding the Coronavirus. That said, many of my friends and family are putting in a valiant effort sewing masks to give away to doctors, nurses, family, friends, and even strangers on Facebook--hopefully these masks are a last resort for some of the folks previously mentioned, but we might as well be prepared for the worst. That had me thinking: as a non-sewing individual, how can I contribute? That's when I saw folks 3D printing parts for face shields and these things called "Ear Savers," which help prevent skin irritation behind the ear caused by friction from the elastic bands found on common protective masks.

Figure 1:  Ear saver shown from:
Prior to printing anything of actual importance, my 3D printer's main duty was to print Play-Doh molds and toys for my kids. It has now been reassigned to a greater purpose, and we are using this article to share some tips and tricks with others who may also want to contribute to battling Covid-19 using 3D printing. You can get up and running with the basics for a couple hundred dollars or less--the rest is convenience and efficiency.


3D Printer
The 3D printer I purchased two years ago is a Comgrow Creality Ender 3 ( It is a very capable printer that can print larger items, but it is sort of a kit that you put together (a task not for the faint of heart). At the time it was regularly priced at $230, but I bought it on an Amazon lightning deal for $180 as a STEM project that I could enjoy with my kids. This is just an example of what I use--but the exact printer does not matter as much for this purpose. 3D printers vary widely in price and capability and some printers even arrive pre-built and ready to go.  ;-)

3D Printing Filament
This is the material that is melted by the heat from the print nozzle and then reformed into whatever you are printing. Polylactic Acid (commonly called PLA) seems to be the most commonly used material, and it is derived from renewable resources like corn starch or sugar cane instead of petroleum.  Two examples of PLA that I have purchased are shown below.

White PLA large spool - $21

Multicolor PLA 4 smaller spools - $23

Figure 2:  The 3D printer doing its thing!

3D Print Designs and Modification

Getting Ideas
If you are new to 3D printing, you may want to start off by using someone else's initial design and then possibly modifying it (if necessary).  A great place to get started with freely shared designs is Thingiverse (  This is where I download .STL files and then convert them to .gcode -- which is essentially the three dimensional coordinates in space that the printer uses to know where to place the print nozzle. If you are a multivariable calculus nerd, you should naturally love 3D printing--but it is not needed to enjoy this hobby.
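To illustrate what "three dimensional coordinates in space" means in practice, here is a minimal Python sketch (not part of any slicer, just an illustration) that pulls the XYZ toolpath out of the G0/G1 move lines in a .gcode file. Axis words like X10.5 carry the coordinates, and axes not mentioned on a line keep their previous value.

```python
# Minimal sketch: extract the XYZ toolpath from G-code move commands.
import re

def parse_moves(gcode_lines):
    """Yield (x, y, z) for each G0/G1 move, carrying forward unchanged axes."""
    pos = {"X": 0.0, "Y": 0.0, "Z": 0.0}
    for line in gcode_lines:
        line = line.split(";")[0].strip()      # strip trailing comments
        if not line.startswith(("G0", "G1")):  # only linear moves
            continue
        for axis, value in re.findall(r"([XYZ])(-?\d+\.?\d*)", line):
            pos[axis] = float(value)
        yield (pos["X"], pos["Y"], pos["Z"])

sample = ["G28 ; home", "G1 X10 Y20 Z0.2 F1500", "G1 X15 ; Y and Z unchanged"]
print(list(parse_moves(sample)))  # → [(10.0, 20.0, 0.2), (15.0, 20.0, 0.2)]
```

Collect those tuples and you have exactly the path the print nozzle traces layer by layer.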

Simple Modifications
If you need to make some modifications to the design or even create your own from scratch, you can use the included slicer software, or there are some really great free resources online. My Ender 3 came with Ultimaker Cura slicer software, but I don't really use it other than to convert the .STL files to .gcode.  Instead, I use a free site called Tinkercad ( which is made freely available by Autodesk. This software is amazingly powerful for editing .STL files from Thingiverse or other sites, but it can also be used to create your own 3D designs. As a final step, I still slice the .STL to .gcode using the included Ultimaker Cura software, but you could also try using the freely available ideaMaker slicer from

Figure 3:  Modifying the design and quantity using Tinkercad

Monitoring the Print Job

There are many fancy ways to monitor a print job, some of which involve customized firmware and Raspberry Pis, but I took a different approach. I purchased a 1080p Wyze camera ( on sale from Amazon for around $20. I then play the video through my Echo Show ("Alexa, show me the Ender Camera") and watch for the print job to complete. The added bonus of using the Wyze camera is that it has night vision, so I can turn the lights off in the garage but still monitor the print with excellent clarity. I can also ask Alexa to set a timer for the print job ("Alexa, set a timer for x minutes") so I know roughly when to peel off the completed work and start a new one. That process only takes a minute; the printer is doing the real work, but it doesn't mind.

Figure 4:  Monitoring the 3D print job using the Wyze camera through the Amazon Echo Show

Design Evaluation

The best thing about being able to 3D print something is instant gratification AND the ability to quickly evaluate the efficacy of an idea without a significant investment of time and money. In the case of the ear savers, we evaluated four different designs found on Thingiverse and tried them with the masks that my family is producing. We were then able to stack rank them based on comfort and start mass producing the most comfortable design. Of course, we appreciate everyone's contributions to Thingiverse, but some designs work better for certain projects or are easier to modify when needed.

For example, the graciously provided ear saver designs we evaluated were:

Figure 5:  Evaluating the ear savers and ranking them in terms of comfort from top to bottom (personal preference)
This quick evaluation allowed us to select the best option, and now we include ear savers with every mask we deliver. When ear saver production outpaces the masks, we send those out where there is a need.


At some point, we may post follow up articles that cover more of the intricacies of 3D printing, but we wanted to share this information with others who might also be looking for ways to contribute to the battle against Covid-19. We hope you enjoyed the article and encourage respectful and helpful feedback in the comment section below. This is by no means the only way to 3D print, but it should help some get started 3D printing to combat Covid-19. It is quite an amazing experience to see or hear about a person's reaction when you freely give them something you created to protect their health--sometimes they cry. There may be no way to describe this feeling in words, but it certainly makes the time and effort worth every second and every penny. Please stay safe and happy 3D printing.

Tuesday, February 25, 2020

Advanced Ticketing Analytics Using Your SIEM

By Tony Lee

The Problem with Ticketing Systems

Regardless of the vendor, ticketing systems are a double-edged sword. At the extremes, they have immense power and utility when used correctly, but they can also overwhelm users and mislead decision makers when used incorrectly. My personal observations indicate that many implementations seem to be stuck in an in-between state: partially useful, but not providing much insight. However, it isn't the product's fault per se, because nothing out of the box can efficiently solve every problem. Ticketing systems are multipurpose tools, so they need a bit of customization to achieve their full potential. But this customization requires a good (but difficult to find) ticketing system developer, right?  Maybe not!  Take a look at our example below using ServiceNow and Splunk, and let us know if you are doing something similar with other platforms.

Figure 1:  Example Splunk dashboard processing and displaying ServiceNow tickets

Possible Solution

All is not lost if you don’t have a good ticketing system developer. If you are sending the ticket data to your SIEM, you can build those dashboards and gain awesome insight within your single pane of glass. Some ideas for useful insights are:

  • Total number of tickets
  • Opened and assigned
  • Unassigned
  • Waiting on third party
  • Ticket priority
  • State of tickets
  • Oldest tickets
  • Most affected user
  • Most affected group/department
  • Responder handling the most tickets
  • Top close code
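As a toy illustration of the roll-ups above, here is how a few of them might be computed in Python over a list of ticket records. The field names are illustrative and not the actual ServiceNow schema; in practice the SIEM computes these for you, as the dashboard code later in this article shows.

```python
# Toy roll-ups over a list of ticket dicts (illustrative field names).
from collections import Counter

tickets = [
    {"number": "INC001", "state": "New", "assigned_to": "", "priority": "High"},
    {"number": "INC002", "state": "Analysis", "assigned_to": "alice", "priority": "High"},
    {"number": "INC003", "state": "Closed", "assigned_to": "bob", "priority": "Low"},
]

total = len(tickets)                                         # total number of tickets
unassigned = sum(1 for t in tickets if not t["assigned_to"]) # unassigned tickets
by_state = Counter(t["state"] for t in tickets)              # state of tickets
top_priority = Counter(t["priority"] for t in tickets).most_common(1)

print(total, unassigned, by_state["New"], top_priority)  # → 3 1 1 [('High', 2)]
```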

Bonus Round

In addition to building the statistical visibility above, filters can be quite useful in narrowing down the information you are seeking. This also allows you to pivot to more complex outliers. The following filters have proven quite useful for us:

  • Time range
  • Wild card search
  • Ticket state
  • Ticket substate
  • Assignment group
  • Priority 

Extra Credit

After creating the filters above, we recommend adding the ability to pivot back to the ticketing system to check out ticket details or possibly take action on tickets, such as changing the status of a ticket with a single click. These workflow efficiencies prevent copy and paste errors and shave serious time and effort off an already overloaded responder's plate. Our example dashboard contains multiple places where a responder may pivot back.


If you are in need of greater insight into tickets and their status, creating that insight in a SIEM such as Splunk may not be a bad idea. Even if you have a ticketing system developer, sometimes they just need some ideas to get started on dashboard development and this may be just what they need. This article is meant to provide ideas and even a jumpstart if Splunk and ServiceNow are in use in your environment. We hope it saves you some time—feel free to leave feedback in the section below.

Note:  In our ServiceNow / Splunk example, we used the existing Cylance ServiceNow Technology Add-on.  That said, not all fields are always properly parsed – especially if they are longer fields and use characters that may break the parsing.  It will get you 95% of the way there though.
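One parsing task the dashboard handles itself is stripping the numeric prefix from fields such as dv_priority ("3 - Moderate" becomes "Moderate") via a rex of the form \d\s-\s(...). For anyone pre-processing these fields outside Splunk, the same idea in Python is a few lines (strip_prefix is our own illustrative helper, not part of any add-on):

```python
# Strip the "N - " prefix ServiceNow puts on priority/severity display values.
import re

def strip_prefix(value: str) -> str:
    """Return 'Moderate' for '3 - Moderate'; pass other values through."""
    match = re.match(r"\d\s-\s(.*)", value)
    return match.group(1) if match else value

print(strip_prefix("3 - Moderate"))  # → Moderate
print(strip_prefix("Moderate"))      # → Moderate
```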

Dashboard Code

The following dashboard assumes that the appropriate logs are being collected and sent to Splunk. Additionally, the dashboard code assumes an index of snow. Feel free to adjust as necessary. Splunk dashboard code provided below:

<form>
  <label>ServiceNow Tickets</label>
  <description>Limited to INSERT_YOUR CI Type</description>
  <search id="Base_Search">
    <query>index=snow sourcetype="snow:incident" $wild$ u_ci_autofill=YOUR_AUTOFILL_ID | dedup number | eval DaysOpen=round((now()-strptime(dv_opened_at, "%m-%d-%Y"))/86400,2) | rex field=dv_priority "\d\s-\s(?&lt;dv_priority&gt;.*)" | rex field=dv_severity "\d\s-\s(?&lt;dv_severity&gt;.*)" | table dv_opened_at, DaysOpen, dv_opened_by, dv_closed_at, dv_closed_by, number, dv_incident, dv_state, dv_substate, dv_close_code, dv_priority, dv_severity, dv_u_affected_user, dv_cmdb_ci, dv_malware_url, dv_approval, dv_assigned_to, dv_assignment_group, dv_short_description, close_notes | search $dv_priority$ $assignment_group$ $substate$ $dv_state$</query>
    <earliest>$time.earliest$</earliest>
    <latest>$time.latest$</latest>
  </search>
  <fieldset submitButton="false" autoRun="true">
    <input type="time" token="time" searchWhenChanged="true">
      <label>Time Range</label>
    </input>
    <input type="text" token="wild" searchWhenChanged="true">
      <label>Wildcard Search</label>
      <default>*</default>
    </input>
    <input type="dropdown" token="dv_state">
      <label>Ticket State</label>
      <choice value="NOT dv_state=Canceled NOT dv_state=Closed NOT dv_state=Resolved">Not Canceled, Closed, Resolved</choice>
      <choice value="*">All</choice>
      <choice value="dv_state=New">New</choice>
      <choice value="dv_state=Analysis">Analysis</choice>
      <choice value="dv_state=Contain">Contain</choice>
      <choice value="dv_state=Cancelled">Cancelled</choice>
      <choice value="dv_state=Closed">Closed</choice>
      <choice value="dv_state=Review">Review</choice>
      <choice value="dv_state=Resolved">Resolved</choice>
      <default>NOT dv_state=Canceled NOT dv_state=Closed NOT dv_state=Resolved</default>
      <initialValue>NOT dv_state=Canceled NOT dv_state=Closed NOT dv_state=Resolved</initialValue>
    </input>
    <input type="multiselect" token="substate">
      <label>Ticket Substate</label>
      <choice value="*">All</choice>
      <choice value="&quot;Waiting on External&quot;">Waiting on External</choice>
      <choice value="SOC">SOC</choice>
      <delimiter> OR </delimiter>
    </input>
    <input type="multiselect" token="assignment_group">
      <label>Assignment Group</label>
      <choice value="*">All</choice>
      <choice value="&quot;SOC Level 1&quot;">SOC Level 1</choice>
      <choice value="&quot;SOC Level 2&quot;">SOC Level 2</choice>
      <choice value="&quot;SOC Level 3&quot;">SOC Level 3</choice>
      <delimiter> OR </delimiter>
    </input>
    <input type="multiselect" token="dv_priority">
      <label>Priority</label>
      <choice value="*">All</choice>
      <choice value="Critical">Critical</choice>
      <choice value="High">High</choice>
      <choice value="Moderate">Moderate</choice>
      <choice value="Low">Low</choice>
      <delimiter> OR dv_priority=</delimiter>
    </input>
  </fieldset>
  <row>
    <panel>
      <single>
        <title>Total Tickets</title>
        <search base="Base_Search">
          <query>| stats count</query>
        </search>
        <option name="drilldown">all</option>
        <option name="refresh.display">progressbar</option>
      </single>
    </panel>
    <panel>
      <single>
        <title>Open and Assigned</title>
        <search base="Base_Search">
          <query>| search dv_state!=Cancelled dv_state!=Closed NOT dv_assigned_to=""  | stats count</query>
        </search>
        <option name="drilldown">all</option>
      </single>
    </panel>
    <panel>
      <single>
        <title>Open and Not Assigned</title>
        <search base="Base_Search">
          <query>| search dv_state!=Cancelled dv_state!=Closed dv_assigned_to="" | stats count</query>
        </search>
        <option name="drilldown">all</option>
      </single>
    </panel>
    <panel>
      <single>
        <title>Waiting on External</title>
        <search base="Base_Search">
          <query>| where dv_substate="Waiting on External" | stats count</query>
        </search>
        <option name="drilldown">all</option>
      </single>
    </panel>
  </row>
  <row>
    <panel>
      <chart>
        <title>Top Priority</title>
        <search base="Base_Search">
          <query>| top limit=0 dv_priority</query>
        </search>
        <option name="charting.chart">pie</option>
      </chart>
    </panel>
    <panel>
      <chart>
        <title>Top State</title>
        <search base="Base_Search">
          <query>| top limit=0 dv_state</query>
        </search>
        <option name="charting.chart">pie</option>
      </chart>
    </panel>
    <panel>
      <chart>
        <title>Top CI</title>
        <search base="Base_Search">
          <query>| top limit=0 dv_cmdb_ci</query>
        </search>
        <option name="charting.chart">pie</option>
      </chart>
    </panel>
  </row>
  <row>
    <panel>
      <chart>
        <title>Longest Open Tickets &gt; 7 days (Click to View Directly in Service Now)</title>
        <search base="Base_Search">
          <query>| stats values(DaysOpen) by number | rename values(DaysOpen) AS DaysOpen | where DaysOpen &gt; 7</query>
        </search>
        <option name="charting.axisTitleX.visibility">visible</option>
        <option name="charting.axisTitleY.visibility">collapsed</option>
        <option name="charting.axisY.scale">linear</option>
        <option name="charting.chart">bar</option>
        <option name="charting.chart.showDataLabels">all</option>
        <option name="charting.chart.stackMode">default</option>
        <option name="charting.drilldown">all</option>
        <option name="charting.layout.splitSeries">0</option>
        <option name="charting.legend.labelStyle.overflowMode">ellipsisEnd</option>
        <option name="charting.legend.placement">right</option>
        <drilldown>
          <link target="_blank">$row.number$</link>
        </drilldown>
      </chart>
    </panel>
    <panel>
      <chart>
        <title>Longest Open Tickets &gt; 7 days (Click to View in Splunk)</title>
        <search base="Base_Search">
          <query>| eval info=dv_assigned_to + " - " + number | stats values(DaysOpen) by info | rename values(DaysOpen) AS DaysOpen | sort - DaysOpen | where DaysOpen &gt; 7</query>
        </search>
        <option name="charting.axisTitleX.visibility">collapsed</option>
        <option name="charting.axisTitleY.visibility">collapsed</option>
        <option name="charting.axisY.scale">linear</option>
        <option name="charting.chart">bar</option>
        <option name="charting.chart.showDataLabels">all</option>
        <option name="charting.chart.stackMode">default</option>
        <option name="charting.drilldown">all</option>
        <option name="charting.layout.splitSeries">0</option>
        <option name="charting.legend.labelStyle.overflowMode">ellipsisEnd</option>
        <option name="charting.legend.placement">right</option>
      </chart>
    </panel>
  </row>
  <row>
    <panel>
      <table>
        <title>Top dv_u_affected_user</title>
        <search base="Base_Search">
          <query>| top limit=0 dv_u_affected_user</query>
        </search>
        <option name="drilldown">cell</option>
      </table>
    </panel>
    <panel>
      <table>
        <title>Top dv_assigned_to</title>
        <search base="Base_Search">
          <query>| top limit=0 dv_assigned_to</query>
        </search>
        <option name="drilldown">cell</option>
      </table>
    </panel>
    <panel>
      <table>
        <title>Top dv_assignment_group</title>
        <search base="Base_Search">
          <query>| top limit=0 dv_assignment_group</query>
        </search>
        <option name="drilldown">cell</option>
      </table>
    </panel>
    <panel>
      <table>
        <title>Top dv_close_code</title>
        <search base="Base_Search">
          <query>| top limit=0 dv_close_code</query>
        </search>
        <option name="drilldown">cell</option>
      </table>
    </panel>
  </row>
  <row>
    <panel>
      <table>
        <title>Details (Click the row to visit ServiceNow directly)</title>
        <search base="Base_Search"></search>
        <option name="dataOverlayMode">none</option>
        <option name="drilldown">cell</option>
        <option name="percentagesRow">false</option>
        <option name="rowNumbers">true</option>
        <option name="totalsRow">false</option>
        <option name="wrap">true</option>
        <format type="color" field="dv_incident">
          <colorPalette type="list">[#65A637,#6DB7C6,#F7BC38,#F58F39,#D93F3C]</colorPalette>
          <scale type="threshold">0,30,70,100</scale>
        </format>
        <format type="color" field="dv_substate">
          <colorPalette type="map">{"Waiting on External":#6A5C9E}</colorPalette>
        </format>
        <format type="color" field="dv_priority">
          <colorPalette type="map">{"High":#D93F3C,"Medium":#F7BC38,"Low":#6DB7C6}</colorPalette>
        </format>
        <drilldown>
          <link target="_blank">$row.number$</link>
        </drilldown>
      </table>
    </panel>
  </row>
</form>