Merge pull request MicrosoftDocs#51 from bwren/dcr-structure
DCR structure update
v-regandowner authored Oct 16, 2024
2 parents 3eee692 + d7c6e1c commit cb0fc7e
Showing 8 changed files with 104 additions and 70 deletions.
@@ -230,7 +230,7 @@ The settings for **collection frequency** and **namespace filtering** in the DCR
When you specify the tables to collect using CLI or ARM, you specify a stream name that corresponds to a particular table in the Log Analytics workspace. The following table lists the stream name for each table.

> [!NOTE]
- > If you're familiar with the [structure of a data collection rule](../essentials/data-collection-rule-structure.md), the stream names in this table are specified in the [dataFlows](../essentials/data-collection-rule-structure.md#dataflows) section of the DCR.
+ > If you're familiar with the [structure of a data collection rule](../essentials/data-collection-rule-structure.md), the stream names in this table are specified in the [Data flows](../essentials/data-collection-rule-structure.md#data-flows) section of the DCR.
| Stream | Container insights table |
| --- | --- |
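As a quick illustration of how these stream names are used (a minimal sketch, not part of this commit), the following `dataFlows` entry would send the `Microsoft-ContainerLogV2` stream to its default table; the destination name `ciworkspace` is an assumed example.

```json
{
  "dataFlows": [
    {
      "streams": [ "Microsoft-ContainerLogV2" ],
      "destinations": [ "ciworkspace" ]
    }
  ]
}
```

Because no `outputStream` is specified, the data lands in the stream's default Log Analytics table.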
@@ -24,7 +24,7 @@ Transformations are implemented in [data collection rules (DCRs)](../essentials/

## Data sources
- The [dataSources section of the DCR](../essentials/data-collection-rule-structure.md#datasources) defines the different types of incoming data that the DCR will process. For Container insights, this is the Container insights extension, which includes one or more predefined `streams` starting with the prefix *Microsoft-*.
+ The [Data sources section of the DCR](../essentials/data-collection-rule-structure.md#data-sources) defines the different types of incoming data that the DCR will process. For Container insights, this is the Container insights extension, which includes one or more predefined `streams` starting with the prefix *Microsoft-*.

The list of Container insights streams in the DCR depends on the [Cost preset](container-insights-cost-config.md#cost-presets) that you selected for the cluster. If you collect all tables, the DCR will use the `Microsoft-ContainerInsights-Group-Default` stream, which is a group stream that includes all of the streams listed in [Stream values](container-insights-cost-config.md#stream-values). You must change this to individual streams if you're going to use a transformation. Any other cost preset settings will already use individual streams.
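For reference (a rough sketch, not taken from this change), an extension data source that collects individual streams rather than the group stream might look similar to the following; the stream list shown is just an example subset.

```json
"dataSources": {
  "extensions": [
    {
      "name": "ContainerInsightsExtension",
      "extensionName": "ContainerInsights",
      "streams": [
        "Microsoft-ContainerLogV2",
        "Microsoft-KubeEvents",
        "Microsoft-KubePodInventory"
      ]
    }
  ]
}
```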

@@ -55,7 +55,7 @@ The sample below shows the `Microsoft-ContainerInsights-Group-Default` stream. S


## Data flows
- The [dataFlows section of the DCR](../essentials/data-collection-rule-structure.md#dataflows) matches streams with destinations that are defined in the `destinations` section of the DCR. Table names don't have to be specified for known streams if the data is being sent to the default table. The streams that don't require a transformation can be grouped together in a single entry that includes only the workspace destination. Each will be sent to its default table.
+ The [Data flows section of the DCR](../essentials/data-collection-rule-structure.md#data-flows) matches streams with destinations that are defined in the `destinations` section of the DCR. Table names don't have to be specified for known streams if the data is being sent to the default table. The streams that don't require a transformation can be grouped together in a single entry that includes only the workspace destination. Each will be sent to its default table.

Create a separate entry for streams that require a transformation. This should include the workspace destination and the `transformKql` property. If you're sending data to an alternate table, then you need to include the `outputStream` property which specifies the name of the destination table.
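A minimal sketch of this pattern (assuming a workspace destination named `ciworkspace` and an illustrative KQL filter) could look like the following:

```json
"dataFlows": [
  {
    "streams": [
      "Microsoft-KubeEvents",
      "Microsoft-KubePodInventory"
    ],
    "destinations": [ "ciworkspace" ]
  },
  {
    "streams": [ "Microsoft-ContainerLogV2" ],
    "destinations": [ "ciworkspace" ],
    "transformKql": "source | where PodNamespace != 'kube-system'"
  }
]
```

If the transformed data were sent to an alternate table instead, the second entry would also carry an `outputStream` property naming that table (for example, a hypothetical `Custom-ContainerLogV2_CL`).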

@@ -15,7 +15,7 @@ ms.reviwer: nikeist
A data collection endpoint (DCE) is a connection where data sources send collected data for processing and ingestion into Azure Monitor. This article provides an overview of data collection endpoints and explains how to create and set them up based on your deployment.

## When is a DCE required?
- Prior to March 31, 2024, a DCE was required for all data collection scenarios using a DCR that required an endpoint. DCRs for supported scenarios created after this date include their own endpoints for logs and metrics. The URL for these endpoints can be found in the [`logsIngestion` and `metricsIngestion`](./data-collection-rule-structure.md#endpoints) properties of the DCR. These endpoints can be used instead of a DCE for any direct ingestion scenarios.
+ Prior to March 31, 2024, a DCE was required for all data collection scenarios using a DCR that required an endpoint. DCRs for supported scenarios created after this date include their own endpoints for logs and metrics. The URL for these endpoints can be found in the [`logsIngestion` and `metricsIngestion`](./data-collection-rule-structure.md#properties) properties of the DCR. These endpoints can be used instead of a DCE for any direct ingestion scenarios.

Endpoints cannot be added to an existing DCR, but you can keep using any existing DCRs with existing DCEs. If you want to move to a DCR endpoint, then you must create a new DCR to replace the existing one. A DCR with endpoints can also use a DCE. In this case, you can choose whether to use the DCE or the DCR endpoints for each of the clients that use the DCR.
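As a rough sketch of where these per-DCR endpoints appear (the endpoint URLs below are placeholders, not real values):

```json
{
  "properties": {
    "logsIngestion": {
      "endpoint": "https://<unique-dcr-name>.<region>-1.ingest.monitor.azure.com"
    },
    "metricsIngestion": {
      "endpoint": "https://<unique-dcr-name>.<region>-1.metrics.ingest.monitor.azure.com"
    }
  }
}
```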

Expand Up @@ -287,7 +287,7 @@ The sample [data collection rule](../essentials/data-collection-rule-overview.md
- Applies a [transformation](../essentials/data-collection-transformations.md) to the incoming data.

> [!NOTE]
- > Logs ingestion API requires the [logsIngestion](../essentials/data-collection-rule-structure.md#endpoints) property which includes the URL of the endpoint. This property is added to the DCR after it's created.
+ > Logs ingestion API requires the [logsIngestion](../essentials/data-collection-rule-structure.md#properties) property which includes the URL of the endpoint. This property is added to the DCR after it's created.
```json
{