Data Lake

The Data Lake serves as a central and flexible data source and is thus the foundation of the entire SIEM. As a central repository, it brings together all raw data collected by agents and collector relays, indexes and groups it, and then presents it in a normalized form. In this way, the Data Lake creates the basis for detecting patterns, anomalies, and threats. By intelligently reusing the data, you can effectively detect security incidents, conduct investigations, and analyze trends.

By setting start and end times, you can customize the time range of your view. The graph below provides an overview of how frequently entries occurred in the defined period.

The layout of the log view can be customized using the left of the two icons on the right. Here you can define how detailed the log preview should be and whether field names should be abbreviated. Clicking the right icon switches to full-screen mode; to leave it, press the Escape key.

To process the large volume of data meaningfully, it is crucial to narrow it down to the results that are relevant for you. For this purpose, you will find a drop-down menu on the left side. The items Generic, Enginsight, Standard, and Event Relay are provided by default in the platform. All other fields are filled with information depending on your integrated systems and are listed grouped by product. At the very bottom of the list you will find the filters of the configured collectors.

Fields

Generic

  • Contains basic information across all types of logs. The difference from more specific log sources is that generic logs are less context-specific and can thus cover a wide range of events. Generic fields are uniform templates that can be overlaid on all existing logs; they recognize the same information (username, geoip.city, geoip.continent, ...) and standardize it.
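As an illustration, here is a minimal, hypothetical sketch of such an overlay in Python. The source field names and the mapping itself are assumptions; only the generic target fields (`username`, `geoip.city`) come from the text above.

```python
# Hypothetical sketch: overlaying generic fields on heterogeneous logs.
# The source field names ("user", "userName", "src_city", "city") and the
# mapping are assumptions for illustration, not part of the product.

GENERIC_MAP = {
    "user": "username",
    "userName": "username",
    "src_city": "geoip.city",
    "city": "geoip.city",
}

def normalize(log: dict) -> dict:
    """Return a copy of the log with known fields renamed to generic ones."""
    return {GENERIC_MAP.get(key, key): value for key, value in log.items()}

# Two differently shaped logs normalize to the same generic representation:
firewall_log = {"userName": "alice", "src_city": "Jena"}
agent_log = {"user": "alice", "city": "Jena"}
print(normalize(firewall_log) == normalize(agent_log))  # True
```

This is the core idea behind the Generic tab: one uniform template over many differently structured log sources.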

Enginsight

  • Contains all event logs captured by the Enginsight components (Defence FIM, IDS and Shield).

Standard

  • Contains all log information that has been individually typed and assigned by extractors.

Event Relay

  • The content relates to the previously configured event relay and contains uniform RFC fields.

Below the product-specific tabs, you will also find the filters of the configured collectors.

Currently we support the following firewall providers:

  • Barracuda Networks

  • Citrix Systems

  • ESET

  • F5 Networks

  • Fortinet

  • G Data CyberDefense

  • Lancom

  • pfSense/OPNsense

  • SonicWall

  • Sophos

  • Trend Micro

  • WatchGuard Technologies

  • genua GmbH

Filters

Use the filter function in the data lake to display specific results. You have two options here. Use the free input field at the top of your data lake to create custom filters, or click on the fields to the left of the data lake.

Clicking on a field shows you all the filter options behind it. Select “+” next to an entry to set it as a filter; select “-” to explicitly exclude results with this value.

Add Filters

After selecting a field from the left side menu next to the data lake, you can add more granular filters. To do this, click on the filter icon at the top right of the pop-up window. Here you will have the option to add a new filter.

Fieldname

  • Here you will find the name of the previously selected field.

Operator

Choose between the following options:

  • Equal: Displays only records where the field value exactly matches the specified value.

  • Unequal: Excludes records where the field value matches the specified value.

  • Exists: Displays all records in which the field exists at all (regardless of its value).

  • Not exists: Displays only records in which the field is missing or does not contain a value.

Value

  • Here you enter the comparison value against which the field content is checked using the selected operator.

Further Fields

  • Exact match: Only identical values with matching upper and lower case are taken into account.

  • Case insensitive: Identical values are considered, regardless of upper and lower case.

  • Regular expression: Flexible pattern search using regular expressions. If this option is selected, enter the regex in the Value field.

You can also add additional filters by clicking on a field in the data lake. Under Operators, select whether the selected field in your filter is: equal, unequal, exists, or not exists. If necessary, activate the full-text search or adjust the value of your field manually.
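The operator and matching semantics described above can be sketched as a small, hypothetical Python filter function. The function itself and its treatment of missing fields are assumptions for illustration; only the operator and matching-mode names come from the text.

```python
import re

def match(log: dict, field: str, operator: str, value: str = "",
          mode: str = "exact") -> bool:
    """Evaluate one filter against a log entry (sketch of the semantics above)."""
    present = field in log and log[field] not in (None, "")
    if operator == "exists":
        return present
    if operator == "not exists":
        return not present
    if not present:
        # Assumption: equal/unequal only compare fields that are present.
        return False
    actual = str(log[field])
    if mode == "exact":                    # case-sensitive comparison
        hit = actual == value
    elif mode == "case insensitive":       # ignore upper/lower case
        hit = actual.lower() == value.lower()
    elif mode == "regular expression":     # value is treated as a regex
        hit = re.search(value, actual) is not None
    else:
        raise ValueError(f"unknown mode: {mode}")
    return hit if operator == "equal" else not hit  # "unequal" inverts the result

log = {"username": "Alice"}
print(match(log, "username", "equal", "alice", "case insensitive"))  # True
print(match(log, "username", "equal", "alice", "exact"))             # False
```

Note how the matching mode (exact, case insensitive, regular expression) is orthogonal to the operator: the mode decides whether a value counts as a hit, and the operator decides whether hits are included or excluded.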

Event Stream

Event Streams are at the heart of the Data Lake and serve as powerful data channels that enable continuous collection, aggregation, and analysis of event data from a wide variety of sources. Continuous, real-time monitoring allows the SIEM to quickly and reliably identify patterns and detect anomalies, potential security risks, and unusual behavior. Use Event Streams to effectively collect, correlate, and analyze event data from multiple sources, protect your IT, and proactively respond to security threats. Whether for compliance and reporting, real-time monitoring of user activity, or incident response and forensics, Event Streams support you in all these areas and help you strengthen your entire IT security structure in the long term.

Create Stream

Use the filter drop-down menu on the left to efficiently filter log entries and reliably create event streams. Clicking on a filter opens a detailed view of the available values. Next to each value is a plus icon, which includes it in the results, and a minus icon, which excludes results with that value. Add filters to display logs with specific information; for example, generate a view of successful SSH logins and then create a new stream from it.

The list of all currently applied filters can be found above the graph. Reset all filters by clicking on "Create new stream" to build new views. Streams that have already been created or predefined can be accessed via "Open stream". Update your stream via the corresponding button, or save your set of filters via "Save stream". In this case, assign a unique name so that you can quickly find the stream again later if necessary.
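Conceptually, a saved stream can be thought of as a named set of include ("+") and exclude ("-") filters applied to each incoming log entry. The following is a minimal sketch under that assumption; the log field names (`program`, `action`, `username`) are invented for the SSH-login example mentioned above.

```python
# Sketch of an assumed model: a stream is a named, saved set of
# include ("+") and exclude ("-") filters applied to every log entry.
from dataclasses import dataclass, field

@dataclass
class Stream:
    name: str
    include: dict = field(default_factory=dict)  # field -> required value ("+")
    exclude: dict = field(default_factory=dict)  # field -> forbidden value ("-")

    def matches(self, log: dict) -> bool:
        # All include filters must hold, and no exclude filter may hold.
        if any(log.get(f) != v for f, v in self.include.items()):
            return False
        return all(log.get(f) != v for f, v in self.exclude.items())

# Example from the text: a stream of successful SSH logins
# (here additionally excluding root logins, as an illustration).
ssh_logins = Stream("successful SSH logins",
                    include={"program": "sshd", "action": "accepted"},
                    exclude={"username": "root"})

events = [
    {"program": "sshd", "action": "accepted", "username": "alice"},
    {"program": "sshd", "action": "failed", "username": "alice"},
    {"program": "sshd", "action": "accepted", "username": "root"},
]
print([e["username"] for e in events if ssh_logins.matches(e)])  # ['alice']
```

Saving the stream under a unique name then corresponds to persisting exactly this filter set so it can be reopened and reapplied later.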
