Data ingestion
Feed event and metric data into Moogsoft Cloud
BASIC | 4 MIN
Integrating with Datadog has never been easier! In this video, you will learn how to create an API key and application key in Datadog, create user credentials in Moogsoft, and configure data types and filters in the Datadog integration.
Demo video: Configure an inbound integration with Datadog ►
If you use Datadog to monitor your environment, you'll want to integrate it with Moogsoft.
By routing your data to Moogsoft, you get richer context through event correlation, quicker anomaly detection through dynamic thresholding, and noise reduction through deduplication. Moogsoft incidents can then be sent back to Datadog with greater context to accelerate resolution.
In this video, you will learn how to:
Create an API key and application key in Datadog
Create user credentials in Moogsoft
Configure data types and filters in the Datadog integration
The Datadog integration is available here.
Provide a meaningful name.
If you already have Datadog credentials, select them from the pulldown. Otherwise, to add a new credential, we need to grab an API key and an application key from Datadog.
Create a new API key...
Go back to Moogsoft, and paste it here.
Also, create an Application key...
Make sure you can tell what this is used for...
Copy the key, not the Key ID.
Paste it here.
Great! Now we have a valid credential to select.
By default, we will be collecting both events and metrics from Datadog, but that’s configurable.
The best practice here is to filter out low-priority data.
Ingesting only the data that is most important to you helps you avoid Datadog rate limiting. The sources, tags, metric names, hosts, and tag filters are pulled from Datadog automatically. This increases the context when issues occur, letting you focus on root cause analysis and the fix!
Now let's test. Success! You get a preview of the events and metrics you can expect to see with the given parameters.
Here’s the request for Datadog Events, which includes the Datadog endpoint and the query parameters we set up.
And we got a response back.
And this is the request for Datadog Metrics, also showing the endpoint.
And the response back confirms the filters that we set up.
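For orientation, a request like the one shown can be sketched in Python. This is an illustrative snippet, not the integration's actual code; it assumes the standard Datadog v1 Events endpoint and key headers, with placeholder keys.

```python
import urllib.parse

# Placeholder keys -- substitute the API key and application key you created.
DD_API_KEY = "<datadog-api-key>"
DD_APP_KEY = "<datadog-application-key>"

def build_events_request(start: int, end: int, priority: str = "normal"):
    """Compose the URL and headers for a Datadog event query."""
    params = urllib.parse.urlencode({"start": start, "end": end, "priority": priority})
    url = f"https://api.datadoghq.com/api/v1/events?{params}"
    headers = {"DD-API-KEY": DD_API_KEY, "DD-APPLICATION-KEY": DD_APP_KEY}
    return url, headers

url, headers = build_events_request(1700000000, 1700003600)
```

The query parameters (here `start`, `end`, and `priority`) correspond to the filters configured in the integration.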
Now you know how to set up an inbound integration between Datadog and Moogsoft. To learn more about the benefits of integrating Moogsoft with Datadog, watch this video.
Also, view this video to learn how to get Moogsoft Incidents into Datadog.
Thanks for watching!
BASIC | 8 MIN
Learn how to use the CYOI feature to configure a custom integration. This video covers:
When to use the custom integration option
Ingest data in JSON format into Moogsoft Cloud
Map data from your monitoring software to Moogsoft event fields
Set up and test a deduplication key to reduce operational noise
Demo video: Create a custom integration ►
After watching this video, you will be able to configure a custom integration to bring data from your monitoring source into Moogsoft. You will be able to identify when to choose this over other integration options, ingest data in JSON format into Moogsoft, map data from your monitoring software to Moogsoft event fields, and set up and test a deduplication key to reduce operational noise.
Moogsoft offers a few different ways to ingest source data. You can install a data collector to the system you want to monitor...
...or integrate with specific monitoring tools.
Or you can build your own custom integration. You should choose a custom integration when there is no collector for your environment, and there is no specific integration available for your monitoring tool.
Creating your own integration has several benefits. First, you are not limited to particular monitoring tools. You can set up a custom integration with any monitoring source that can send data in JSON format.
Second, you will use real data to design and test your integration. The Create Your Own Integration feature makes it easy to inspect source data before you map it.
And finally, creating a custom integration is simple. When you configure a webhook in a monitoring tool, you usually have to write code to modify the outgoing data payload. The Create Your Own Integration feature does all the mapping right in the Moogsoft user interface, with no code.
After you have ingested data through a custom integration, you can transform and enrich it using the Moogsoft Workflow Engine, still without writing any code.
Let’s step through setting up a custom integration together. Let’s say we have source data in JSON format that’s structured like this:
We want to ingest the events so we can deduplicate and correlate them in Moogsoft.
And we need to map them to the Moogsoft event fields.
We also want to map the values for severity. Our source events only have four severity levels: NONE, INFO, WARN, and PROBLEM. We want to map these values to Moogsoft severity levels like this.
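The value mapping can be sketched in Python like this. The Moogsoft target values shown are an assumed choice for illustration; pick whatever mapping fits your alerting policy.

```python
# Assumed mapping from the source's four severity levels to Moogsoft
# severity names -- adjust the target values to suit your own policy.
SEVERITY_MAP = {
    "NONE": "clear",
    "INFO": "unknown",
    "WARN": "warning",
    "PROBLEM": "critical",
}

def map_severity(source_value: str) -> str:
    # Fall back to "unknown" for any value the source adds later.
    return SEVERITY_MAP.get(source_value.upper(), "unknown")
```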
This is where you can set up a custom data ingestion. We are going to set up a new integration. Our source data in this example is events, but note that we support metrics as well.
We just created a custom endpoint for our integration.
Grab that, along with the API key for our Moogsoft instance, which is found here.
We’ve input those values in the source system. And now, Moogsoft is receiving raw data from our source. Note that up to 10 of the most recent events are cached. You can click the reload button to refresh the list of data payloads.
We can examine the data for each event here.
Note that some of these events have blank hostnames, while for others the hostname is populated.
Let’s see how the payload and target fields line up. Moogsoft automatically matched the obvious ones.
We don’t have a value for source, which is a required field. So we need to map something to that field.
Let’s look at the data again. We want to map hostname to source, but we know it’s blank in some cases, and required fields can’t be blank.
Let’s do this. We’ll map hostname to source if it exists, and if not we’ll use ip_address.
We’ll map hostname to source first. Then we’ll add a second mapping, and map ip_address. Moogsoft will use the first mapping if a value exists and is valid. Otherwise, it will move on and use the next mapping.
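The fall-through behavior described above can be sketched like this. It's an illustrative snippet, not Moogsoft's internal logic:

```python
def resolve_source(event: dict) -> str:
    """Return the first mapped field with a non-empty value, mirroring
    how ordered mappings fall through to the next candidate."""
    for field in ("hostname", "ip_address"):
        value = event.get(field)
        if value:  # skips missing keys, None, and empty strings
            return value
    raise ValueError("no usable value for the required 'source' field")

resolve_source({"hostname": "web-01", "ip_address": "10.0.0.5"})  # "web-01"
resolve_source({"hostname": "", "ip_address": "10.0.0.5"})        # "10.0.0.5"
```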
This defines the condition that generates the event.
We’ll map the Type field to check.
Next let’s take care of the severity. Our monitoring service has only four severity levels: NONE, INFO, WARN, and PROBLEM. So we’ll map them to the Moogsoft severity levels like this...
With that we took care of all the required fields. Let’s see what other fields we might want to bring in. Let’s say we want to keep region. And let’s add a custom tag for ip_address.
The last step before we save our integration is to configure the deduplication key. Here’s how deduplication works. Moogsoft uses the fields in the deduplication key to assign multiple events to the same alert and reduce noise. The idea behind deduplication is that events with the same context should become part of the same alert.
For example, we might find out about a condition that affects one of our hosts with a warning event, which then escalates to a critical event. Since the key context is the same, Moogsoft would assign those events to the same alert.
Here comes another event, and since its dedupe key is different, it is categorized into a separate alert. As you can see, selecting the right fields for your use case is the key factor in successful deduplication. The default is the combination of the Source, Service, and Check fields, but you can select what works for you. Just ask yourself: what should two events have in common to be considered duplicates?
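The grouping behavior can be illustrated with a short sketch, assuming a simple "|"-joined key over the default fields. This shows the concept, not Moogsoft's actual key format.

```python
from collections import defaultdict

def dedupe_key(event: dict) -> str:
    # Default key fields for this example: source, service, and check.
    return "|".join(str(event.get(f, "")) for f in ("source", "service", "check"))

def group_events(events):
    """Assign events with the same dedupe key to the same alert bucket."""
    alerts = defaultdict(list)
    for ev in events:
        alerts[dedupe_key(ev)].append(ev)
    return alerts

events = [
    {"source": "host1", "service": "web", "check": "cpu", "severity": "warning"},
    {"source": "host1", "service": "web", "check": "cpu", "severity": "critical"},
    {"source": "host2", "service": "web", "check": "cpu", "severity": "warning"},
]
alerts = group_events(events)  # the first two events share a key
```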
Going back to our custom integration, these are the default fields for deduplication. You can change them if you wish, but Moogsoft recommends you use the defaults unless your business needs require that you use different fields.
In our case we don’t have a value for class, so we’ll remove that from the default keys.
Let’s test. Success! Here are the resulting deduplication keys.
Based on these values, some of the events have been assigned to the same alert.
Our custom integration is ready to go. Let’s save and activate it. Note that it will stay in provisioned status until Moogsoft processes an event.
The monitoring data is flowing into Moogsoft through our custom integration. Thanks for watching!
BASIC | 2 MIN
Do you know how to install a Moogsoft collector? If not, watch this demo.
Demo video: Install a Moogsoft Cloud Collector ►
Watch how to install a Moogsoft Cloud Collector.
In this demo, we are going to install the Moogsoft collector to an EC2 instance that's running Docker.
Here's the installation script.
It sets the API key...
And the controller variable...
...then downloads the installer script and plugs in those variables.
There's one more variable to be set, but we have to run this script first.
The install script is downloading now.
Here’s a prompt to set the file location variable. Note that the file location will be different based on whether you are installing the collector as a root user or not.
Saving this is not a requirement. But if you want to start, stop, and reload the collector later, either put this variable in one of your Linux startup config files, or export it like this.
Here’s the collector we just installed.
By default only the System plugin is enabled.
Let’s enable the Docker plugin. You can apply a filter to exclude certain container ID prefixes if you want.
And now the Docker plugin is enabled. Let's check whether the metrics are flowing in. A wide variety of metrics are available. These came from the Docker plugin we just enabled, and they include container memory, CPU, network, and block IO stats.
These came from the system plugin that was enabled by default.
Each of these metrics comes with a variety of useful tags that you can use for troubleshooting or for correlating alerts when something happens.
Now you know how to install a collector. Thanks for watching!
BASIC | 7 MIN
After watching this video, you will be able to set up a template in AppDynamics to send data to Moogsoft, configure a JSON payload to map AppDynamics data to Moogsoft event fields, and define an AppDynamics policy to forward health rule violations and other issues to Moogsoft.
Demo video: Integrate with AppDynamics ►
After watching this video, you will be able to configure AppDynamics to send events to the Moogsoft API. Specifically, you will know how to:
Set up a template in AppDynamics to send data to Moogsoft
Configure a JSON payload to map AppDynamics data to Moogsoft event fields
Define an AppDynamics policy to forward health rule violations and other issues to Moogsoft
Here’s our scenario. We want to collect application performance data from AppDynamics so we can deduplicate and correlate it in Moogsoft.
You need two pieces of information to set up this integration: the Moogsoft URL and the API key.
Let's go to your AppDynamics instance. Let's say we want to collect health and performance anomalies from the application you instrumented.
We are going to start by defining a new HTTP request template. With this template, we can set up an outbound webhook from AppDynamics to Moogsoft. Name the template...
...and enter the POST method, and the URL from your Moogsoft AppDynamics integration.
We need some custom request headers to accompany our data payload. The content type is application/json. And here is where we enter the apiKey from Moogsoft.
Setting up the data payload is a pretty involved task, so let me park this for now and take care of the rest of the template first.
Here’s where you configure the response settings. Plug in the failure codes here. And the success codes here. A 200 means Moogsoft accepts the data payload.
We’ll leave the rest of the settings the same. So that’s the template. Now let’s drill down into the payload setup section we skipped. Here’s a sample script from the Moogsoft documentation. You can use this to map the AppDynamics data fields to those in Moogsoft.
You do not need to map every single source data field. If the field value is not useful for deduplication, correlation, or troubleshooting, you can leave that field unmapped and drop the data.
Here’s a common use case. Service is an important field, but the source data from AppDynamics doesn’t contain that information. You are planning to get that information from a CMDB. What you can do in such a case, is to provide a default placeholder text. Then set up an event enrichment workflow to replace the placeholder value with the real information.
Let’s take a look at the sample script. The first part cleans up the AppDynamics messages.
The next parts extract and define the fields we want.
Here, we’re defining the Moogsoft event object using the values extracted above.
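For orientation, the event object the script composes has roughly this shape, shown here as a Python dict. The field names follow the Moogsoft Events API; the values are hypothetical stand-ins for the extracted AppDynamics data.

```python
# Hypothetical example of the composed Moogsoft event payload.
moogsoft_event = {
    "source": "node-42",                  # host/node that raised the event
    "check": "health rule violation",     # the kind of condition detected
    "service": ["CMDB_LOOKUP"],           # placeholder, replaced by enrichment later
    "description": "CPU utilization exceeded the configured threshold",
    "severity": 5,                        # assumed numeric scale, 5 = critical
}
```

Note the `service` placeholder: as described above, a workflow can later swap it for the real value from your CMDB.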
Let’s copy this script into the payload section of the template. Set the type to application/json.
Our HTTP request template is ready. Here's a tip: you need to save the template before you can test it. Now we can test it. Suppose the source event in AppDynamics reports a health rule violation... It says the event was successfully posted.
Let's check on the Moogsoft side. Here it is: the health rule violation alert has been received!
Lastly, we are going to set up trigger logic to invoke the webhook when AppDynamics detects an issue. To do that, we need to create an AppDynamics Policy. We’ll name it Moogsoft.
Here’s the trigger logic we want.
When there are health rule violations, we want to trigger an action... which is to invoke our HTTP request template and send data to Moogsoft.
Now you know how to set up a webhook integration with AppDynamics. In this demo we have configured only one application, but if you have a large AppDynamics implementation you can copy your configuration across multiple applications and controllers. Contact your AppDynamics support team for information about the AppDynamics Configuration Exporter utility. Thanks for watching!
BASIC | 4 MIN
This video demonstrates how to integrate AWS CloudWatch with Moogsoft.
Demo video: Integrate with AWS CloudWatch ►
In this video, we will go over how to integrate AWS CloudWatch and send the data to Moogsoft.
The instructions are conveniently located right here.
We are going to add our AWS credentials.
We need to create a policy and a role in AWS to complete the configuration. To do so, we need this JSON script from Moogsoft.
Grab this JSON script.
And now we are going to AWS.
Navigate to the policy section.
Paste the JSON script we grabbed from Moogsoft...
...and provide a meaningful name.
Now we have the policy. Next, let's create the role.
"Another AWS Account" is what we need here.
We need to provide the Moogsoft AWS account number here.
Let's go back to Moogsoft. Copy this...
And paste it here.
Next, check "Require External ID". We need to grab the external ID from Moogsoft. Here it is.
To this role we are creating, we need to associate the policy that we just created. Let me search by the name I gave that policy.
Lastly, provide a name for the role.
And now we have the role created.
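The trust relationship configured in the steps above amounts to a policy like the following sketch, expressed as a Python dict for readability. The account ID and external ID are placeholders for the values you copy from the Moogsoft integration page.

```python
import json

# Placeholders -- use the account number and external ID shown in Moogsoft.
MOOGSOFT_ACCOUNT_ID = "<moogsoft-aws-account-number>"
EXTERNAL_ID = "<external-id-from-moogsoft>"

trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        # Allows the Moogsoft AWS account to assume this role.
        "Principal": {"AWS": f"arn:aws:iam::{MOOGSOFT_ACCOUNT_ID}:root"},
        "Action": "sts:AssumeRole",
        # The external ID guards against the confused-deputy problem.
        "Condition": {"StringEquals": {"sts:ExternalId": EXTERNAL_ID}},
    }],
}
print(json.dumps(trust_policy, indent=2))
```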
Let's go back to Moogsoft and plug in the new role information. Here's the name of the role, and here's the AWS account number.
And the new credentials are now available for use.
Select a region, and test.
It works! Now, the data from CloudWatch will start to flow into this instance of Moogsoft.
Thanks for watching!
BASIC | 4 MIN
After watching this video, you will be able to create an integration to ingest New Relic incidents and set up a webhook in New Relic.
Demo video: Send events to Moogsoft Cloud from New Relic ►
In this video, you will learn how to create an integration to ingest New Relic incidents and how to set up a webhook in New Relic.
Here’s our scenario. We want to ingest alerts identified by New Relic so we can deduplicate and correlate them in Moogsoft.
Here’s the New Relic integration.
The API is provisioned. We’ll make it active after we finish setting it up.
Next we are going to get data from New Relic. Moogsoft will cache up to 10 New Relic events until the integration is active. These counters show how many payloads we've received; so far, none.
These are what we need.
Now going back to New Relic. We’ll start by configuring a notification channel to send issues to Moogsoft.
The channel type is webhook.
...and this is where you paste the Moogsoft Endpoint API URL we grabbed.
We need some custom request headers for our data payload. The content type is application/json.
And here is where we enter the apiKey.
We'll use the default New Relic output, so we don't need to set up a custom data payload. Let's test these settings. Good!
Make sure to add this notification to the New Relic policy associated with the alerts you want to send to Moogsoft.
The test notification showed up in Moogsoft.
Here's the JSON payload from the test notification.
Field mapping is preconfigured to make it easy for you.
Optionally, at this point, you can also pick the deduplication key. Moogsoft uses the dedupe key value to combine events with the same context into a single alert, while updating the alert fields that change over time like description and status. By default Moogsoft uses the combination of source, class, service, and check fields. We’ll keep the defaults...
And test the deduplication key. Looks good! Now when another event with the same value for this field comes in, Moogsoft will update alert fields like latest event arrival time and severity, as needed.
With data flowing in, our integration activates.
Now we can see our data from New Relic in the Alert view. Thanks for watching!
BASIC | 4 MIN
After watching this video, you will be able to integrate Zabbix with Moogsoft Cloud. This video covers how to define a Moogsoft media type for Zabbix, set up a Moogsoft user, and define an action to send events. Note, this demo uses Zabbix version 5.4.
Demo video: Integrate with Zabbix ►
After watching this video, you will be able to configure an integration to send events from Zabbix into the Moogsoft Cloud.
This demo uses Zabbix version 5.4, which may differ from your instance, but it should orient you well enough. Let's get started!
Name your integration, then grab the Moogsoft Endpoint and API key.
You also need a Zabbix media type file, which is available in our docs.
OK we have everything we need to make this happen. Let’s go to our Zabbix instance. Import the media type file we just downloaded.
Then add the API key and the Moogsoft Endpoint.
This part of the media type definition maps Zabbix field names to Moogsoft fields.
These fields are required.
Add optional fields based on your requirements.
This Javascript uses the mapped parameters to do further refinement, and compose the JSON payload to Moogsoft. You can edit the script to add optional fields.
Let’s test to make sure it works. Looks good. The media type task is done.
Next, we need to create a user to receive event notifications using our Moogsoft media type.
Then add the user to a group to grant the right level of access. The Moogsoft user won’t need access to the Zabbix front end, so we’ll choose this one.
The password is required but it won’t be used, so you can enter anything here.
This user needs to be able to use the Moogsoft media type. This will send event notifications directly to Moogsoft Cloud.
And this user needs a super admin role.
The user definition task is done. Now we have the pathway to Moogsoft Cloud all set up. Next, we want to add trigger logic that uses that pathway when events happen. We'll do that by defining an action. You can update existing actions or define custom actions that trigger on specific problems, but here we'll simply define an action to send all events to Moogsoft.
If we wanted to restrict the severity of events or filter them in some other way, we could define a condition here.
Let’s say we want to send all events, so we won’t add a condition. Under Operations, specify what happens when an event, recovery, or update occurs.
We will send a message to the Moogsoft user using the media type Moogsoft Cloud.
We’ll repeat these settings for recovery and update events and add the action.
All set. The action definition task is done. Now we can see our Zabbix event data in Moogsoft Cloud. Thanks for watching!
BASIC | 5 MIN
Learn how to send sample event and metric data using the Events and Metrics APIs.
Demo video: Use APIs to send sample data into Moogsoft Cloud ►
In this video, you will learn how to use best practices in managing API keys, navigate the Moogsoft API docs to locate relevant information, and build Event and Metrics API commands.
Suppose you have a monitoring tool collecting metric and event data, and you want to route that data to Moogsoft using our Event and Metric APIs.
You can build the integration right in our API documentation. Let me show you how.
But first, a little bit about API key management.
You need an API key to push data to Moogsoft. The best practice is not to use the primary API key that you see on the Events API ingestion service page.
Rather, we recommend you create a revocable API key. This way you can grant and remove access to Moogsoft for different users and applications.
Here, you can create and revoke the keys as needed.
Let's create one! Provide a meaningful name and description so other administrators will know what this key is for.
And the API key is generated for you to copy and use. For security reasons, you will not be able to retrieve it once you leave this screen.
If and when you need to revoke it, you can do so here.
Next, we are going to use our Moogsoft API documentation. Here’s a Metrics API sample code that includes the required fields and optional fields for this endpoint.
Here’s the endpoint for the Metrics API.
Here is sample code showing how to call this API in a variety of languages.
To make this code speak to your instance of Moogsoft, add the API key you generated and copied earlier. Now the sample code is updated to include your key.
The same goes for the parameters: as you edit them, the sample code updates automatically. When ready, click Try It to actually ping your instance with the sample code.
Here’s the response. Looks like we are successfully communicating with our instance of Moogsoft!
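Outside the docs site, a minimal version of such a call might look like this Python sketch. The endpoint URL and field names here are placeholders and assumptions; copy the real endpoint and required fields from the Metrics API page. The request is assembled but the actual send is left commented out.

```python
import json
import urllib.request

# Placeholders -- copy the real endpoint and your revocable key from the docs page.
ENDPOINT = "https://example.invalid/v1/metrics"
API_KEY = "<your-revocable-api-key>"

# Field names mirror the docs sample; verify them against the current
# Metrics API reference for your instance.
datum = {
    "metric": "cpu.load",     # metric name
    "data": 0.87,             # the measured value
    "source": "web-01",       # where the measurement came from
    "tags": {"env": "demo"},  # optional context for correlation
}

req = urllib.request.Request(
    ENDPOINT,
    data=json.dumps(datum).encode(),
    headers={"Content-Type": "application/json", "apiKey": API_KEY},
    method="POST",
)
# Uncomment to actually send the datum:
# with urllib.request.urlopen(req) as resp:
#     print(resp.status)
```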
This is how you can build your integration with some level of assurance that it works.
Once you fully develop your integration, copy and paste the resulting code into a CLI.
You can also build shell scripts to add logic and conditionals around the calls we simulated on the API docs site. The Events API works the same way if you want to ingest events as well as metrics.
Now you know how to use our API documentation to send sample data into Moogsoft, then use it to create your own code. Thanks for watching!
BASIC | 11 MIN
Your source payloads may include multiple events sent in a "batch" under a single object. In such cases, you can use batch processing to identify the top-level object and map event fields for individual events. Watch the video to learn how. Also, read the documentation on the topic here.
Demo video: Use the Create Your Own Integration API to send batch event data into Moogsoft Cloud ►
In this video, you will learn how to create a custom integration to process multi-event data payloads, use Moogsoft documentation to build sample data to test your integration, and understand the JSON structure of batch event payloads.
For this demo, let's use the Create Your Own Integration ingestion service, which gives you more flexibility over which fields to ingest and how to map them.
Let me show you how it works. Give it a meaningful name, and add a description. Select Events.
Now we have a Moogsoft endpoint, as well as the API Key. We will need these in order to send over our batch events.
Now open the Moogsoft documentation to pick up an example request. This sample curl command posts two events to Moogsoft. Grab it and paste it into an editor.
To make it work with our instance, add your Moogsoft endpoint and API key.
These are all the variables. We are passing in events...
...which are organized by this key.
There are a variety of elements in the sample code, and one thing we want to make sure of is that the default fields used for deduplication are available. By default, Moogsoft uses the combination of these field values to determine duplicate events, so naturally we want to make sure these fields are included.
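The shape of such a batch payload can be sketched as follows. The "events" wrapper key and the field values are illustrative; whatever top-level key your source actually uses is what you later select as the path to the event list.

```python
# Illustrative batch payload: multiple events nested under one top-level key.
batch = {
    "events": [
        {
            "source": "host-a", "check": "cpu", "service": ["web"],
            "description": "load high", "severity": 3,
        },
        {
            "source": "host-b", "check": "disk", "service": ["db"],
            "description": "disk filling", "severity": 4,
        },
    ]
}
# Each element carries the default deduplication fields (source, check,
# service), so the default dedupe key works without changes.
```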
Now copy the curl command, and paste it in a terminal.
It successfully came in.
Make sure to enable batch processing, and select the path to the event list. In this case, it’s events.key.
Next, map the payload fields to the Moogsoft target fields.
Since all the fields used for deduplication are in the example payload, we don’t need to do anything for the deduplication key.
Let’s test. Success!
Now you know how to process batch events using the Create Your Own Integration API. Thanks for watching!