The Environments object. Environments are independent and isolated Propel workspaces for development, staging (testing), and production workloads. Environments are hosted in a specific region, initially in us-east-2 only.
The Application object. Propel Applications represent the web or mobile app you are building. They provide the API credentials that allow your client- or server-side app to access the Propel API. The Application’s Propeller determines the speed and cost of your Metric Queries.
The Environments object. Environments are independent and isolated Propel workspaces for development, staging (testing), and production workloads. Environments are hosted in a specific region, initially in us-east-2 only.
A Propeller determines your Application’s query processing power. The larger the Propeller, the faster the queries and the higher the cost. Every Propel Application (and therefore every set of API credentials) has a Propeller that determines the speed and cost of queries.
The Application object. Propel Applications represent the web or mobile app you are building. They provide the API credentials that allow your client- or server-side app to access the Propel API. The Application’s Propeller determines the speed and cost of your Metric Queries.
The Environments object. Environments are independent and isolated Propel workspaces for development, staging (testing), and production workloads. Environments are hosted in a specific region, initially in us-east-2 only.
A Propeller determines your Application’s query processing power. The larger the Propeller, the faster the queries and the higher the cost. Every Propel Application (and therefore every set of API credentials) has a Propeller that determines the speed and cost of queries.
The Data Source object. A Data Source is a connection to your data warehouse. It has the necessary connection details for Propel to access Snowflake or any other supported Data Source.
The Environments object. Environments are independent and isolated Propel workspaces for development, staging (testing), and production workloads. Environments are hosted in a specific region, initially in us-east-2 only.
Enables or disables access control for the Data Pool.
If the Data Pool has access control enabled, Applications must be assigned Data Pool Access Policies in order to query the Data Pool and its Metrics.
HTTP basic access authentication credentials. You must configure these same credentials to be included in the
X-Amz-Firehose-Access-Key header when Amazon Data Firehose issues requests to its custom HTTP endpoint. See HttpBasicAuthSettings.
Override the Data Pool’s table settings. These describe how the Data Pool’s table is created in ClickHouse, and a
default will be chosen based on the Data Pool’s timestamp value, if any. You can override these
defaults in order to specify a custom table engine, custom ORDER BY, etc. See TableSettings.
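Where these defaults need to be overridden, the table settings are supplied as part of the Data Pool creation input. The snippet below is a hedged sketch only: the orderBy and engine field names are assumptions, since this section only states that a custom table engine and a custom ORDER BY can be specified.

```graphql
# Sketch only: "orderBy" and "engine" are assumed field names. This section
# only states that the timestamp-derived defaults can be overridden with a
# custom table engine, a custom ORDER BY, and similar settings.
tableSettings: {
  orderBy: ["timestamp", "account_id"]
  # engine: a custom ClickHouse table engine could be set here
}
```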
Enables or disables access control for the Data Pool.
If the Data Pool has access control enabled, Applications must be assigned Data Pool Access Policies in order to query the Data Pool and its Metrics.
HTTP basic access authentication credentials. You must configure these same credentials to be included in the
X-Amz-Firehose-Access-Key header when Amazon Data Firehose transmits records from your DynamoDB table to its
custom HTTP endpoint. See HttpBasicAuthSettings.
Override the Data Pool’s table settings. These describe how the Data Pool’s table is created in ClickHouse, and a
default will be chosen based on the Data Pool’s timestamp value, if any. You can override these
defaults in order to specify a custom table engine, custom ORDER BY, etc. See TableSettings.
Copy this value into the X-Amz-Firehose-Access-Key header when configuring your Amazon Data Firehose to
transmit records from your DynamoDB table to its custom HTTP endpoint.
The HTTP Basic authentication settings for uploading new data. If this parameter is not provided, anyone with the URL to your tables will be able to upload data. While it’s OK to test without HTTP Basic authentication, we recommend enabling it. See HttpBasicAuthSettings.
The connection settings for an Amazon S3 Data Source. These include the Amazon S3 bucket name, the AWS access key ID, and the tables (along with their paths). We do not allow fetching the AWS secret access key after it has been set.
Enables or disables access control for the Data Pool.
If the Data Pool has access control enabled, Applications must be assigned Data Pool Access Policies in order to query the Data Pool and its Metrics.
The HTTP basic authentication settings for the Twilio Segment Data Source URL. If this parameter is not provided, anyone with the webhook URL will be able to send events. While it’s OK to test without HTTP Basic authentication, we recommend enabling it. See HttpBasicAuthSettings.
Override the Data Pool’s table settings. These describe how the Data Pool’s table is created in ClickHouse, and a
default will be chosen based on the Data Pool’s timestamp and uniqueId values, if any. You can override these
defaults in order to specify a custom table engine, custom ORDER BY, etc. See TableSettings.
Enables or disables access control for the Data Pool.
If the Data Pool has access control enabled, Applications must be assigned Data Pool Access Policies in order to query the Data Pool and its Metrics.
The HTTP basic authentication settings for the Webhook Data Source URL. If this parameter is not provided, anyone with the webhook URL will be able to send events. While it’s OK to test without HTTP Basic authentication, we recommend enabling it. See HttpBasicAuthSettings.
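As an illustration, the HTTP Basic authentication settings can be supplied when the Webhook Data Source is created. The sketch below is hypothetical: the mutation name, input fields, and selection set are assumptions; only the username/password pairing follows from the HttpBasicAuthSettings described here.

```graphql
# Sketch only: the mutation name, input fields, and selection set are
# assumptions; only the username/password basic-auth pairing is described
# in this section.
mutation {
  createWebhookDataSource(
    input: {
      uniqueName: "my_webhook"
      connectionSettings: {
        basicAuth: { username: "webhook-user", password: "example-password" }
      }
    }
  ) {
    dataSource {
      id
    }
  }
}
```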
Override the Data Pool’s table settings. These describe how the Data Pool’s table is created in ClickHouse, and a
default will be chosen based on the Data Pool’s timestamp and uniqueId values, if any. You can override these
defaults in order to specify a custom table engine, custom ORDER BY, etc. See TableSettings.
The unique ID column, if any. Propel uses the primary timestamp and a unique ID to compose a primary key for determining whether records should be inserted, deleted, or updated. Deprecated: Will be removed; use Table Settings to define the primary key.
A list of checks performed on the Data Source during its most recent connection attempt.
The Data Source Check object. Data Source Checks are executed when setting up your Data Source. They check that Propel will be able to receive data and set up Data Pools. The exact Checks to perform vary by Data Source. For example, Snowflake-backed Data Sources will have their own specific Checks.
If you list Data Pools via the dataPools field on a Data Source, you will get Data Pools for the Data Source. The dataPools field uses cursor-based pagination typical of GraphQL APIs. You can use the pairs of parameters first and after or last and before to page forward or backward through the results, respectively. For forward pagination, the first parameter defines the number of results to return, and the after parameter defines the cursor to continue from. You should pass the cursor for the last result of the current page to after. For backward pagination, the last parameter defines the number of results to return, and the before parameter defines the cursor to continue from. You should pass the cursor for the first result of the current page to before.
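For example, forward pagination over a Data Source’s Data Pools might look like the sketch below. The parent query and the connection fields (edges, node, pageInfo) are assumed Relay-style conventions not spelled out here; first and after behave as described above.

```graphql
# Sketch only: "dataSource", "edges", "node", and "pageInfo" are assumed
# Relay-style names. Pass the cursor of the last result on the current page
# as "after" to fetch the next page, as described above.
query DataPoolsPage($id: ID!, $cursor: String) {
  dataSource(id: $id) {
    dataPools(first: 10, after: $cursor) {
      edges {
        cursor
        node {
          id
        }
      }
      pageInfo {
        hasNextPage
        endCursor
      }
    }
  }
}
```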
The Data Source Check object. Data Source Checks are executed when setting up your Data Source. They check that Propel will be able to receive data and set up Data Pools. The exact Checks to perform vary by Data Source. For example, Snowflake-backed Data Sources will have their own specific Checks.
The table introspection object. When setting up a Data Source, Propel may need to introspect tables in order to determine what tables and columns are available to create Data Pools from. The table introspection represents the lifecycle of this operation (whether it’s in-progress, succeeded, or failed) and the resulting tables and columns. These will be captured as table and column objects, respectively.
The table’s creator. This corresponds to the initiator of the table Introspection. It can be either a User ID, an Application ID, or “system” if it was created by Propel.
The column object. Once a table introspection succeeds, it creates a new table object for every table it introspected. Within each table object, it also creates a column object for every column it introspected.
The column’s creator. This corresponds to the initiator of the table introspection. It can be either a User ID, an Application ID, or “system” if it was created by Propel.
This is the suggested Data Pool column type to use when converting this Data Source column to a Data Pool column.
Propel makes this suggestion based on the Data Source column type. If the Data Source column type is unsupported, this field returns null. Sometimes, you know better which Data Pool column type to convert to. In these cases, you can refer to supportedDataPoolColumnTypes for the full set of supported conversions. See ColumnType.
This is the set of supported Data Pool column types you can use when converting this Data Source column to a Data Pool column. If the Data Source column type is unsupported, this field returns an empty array. For example, a numeric Data Source column type could be converted to a narrower or wider numeric Data Pool column type; a string-valued Data Source column type could be mapped to a date or timestamp Data Pool column type. See ColumnType.
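For instance, a table introspection result could be inspected column by column for the suggested and supported conversions. The selection below is a sketch only: the surrounding tables/columns/nodes fields are assumptions; the two column-type fields follow the descriptions above.

```graphql
# Sketch only: the surrounding "tables", "columns", and "nodes" selections are
# assumptions; the two column-type fields follow the descriptions above.
{
  tables {
    nodes {
      name
      columns {
        nodes {
          name
          type
          suggestedDataPoolColumnType
          supportedDataPoolColumnTypes
        }
      }
    }
  }
}
```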
The column object. Once a table introspection succeeds, it creates a new table object for every table it introspected. Within each table object, it also creates a column object for every column it introspected.
The column’s creator. This corresponds to the initiator of the table introspection. It can be either a User ID, an Application ID, or “system” if it was created by Propel.
This is the suggested Data Pool column type to use when converting this Data Source column to a Data Pool column.
Propel makes this suggestion based on the Data Source column type. If the Data Source column type is unsupported, this field returns null. Sometimes, you know better which Data Pool column type to convert to. In these cases, you can refer to supportedDataPoolColumnTypes for the full set of supported conversions. See ColumnType.
This is the set of supported Data Pool column types you can use when converting this Data Source column to a Data Pool column. If the Data Source column type is unsupported, this field returns an empty array. For example, a numeric Data Source column type could be converted to a narrower or wider numeric Data Pool column type; a string-valued Data Source column type could be mapped to a date or timestamp Data Pool column type. See ColumnType.
The column object. Once a table introspection succeeds, it creates a new table object for every table it introspected. Within each table object, it also creates a column object for every column it introspected.
The column’s creator. This corresponds to the initiator of the table introspection. It can be either a User ID, an Application ID, or “system” if it was created by Propel.
This is the suggested Data Pool column type to use when converting this Data Source column to a Data Pool column.
Propel makes this suggestion based on the Data Source column type. If the Data Source column type is unsupported, this field returns null. Sometimes, you know better which Data Pool column type to convert to. In these cases, you can refer to supportedDataPoolColumnTypes for the full set of supported conversions. See ColumnType.
This is the set of supported Data Pool column types you can use when converting this Data Source column to a Data Pool column. If the Data Source column type is unsupported, this field returns an empty array. For example, a numeric Data Source column type could be converted to a narrower or wider numeric Data Pool column type; a string-valued Data Source column type could be mapped to a date or timestamp Data Pool column type. See ColumnType.
The column object. Once a table introspection succeeds, it creates a new table object for every table it introspected. Within each table object, it also creates a column object for every column it introspected.
The column’s creator. This corresponds to the initiator of the table introspection. It can be either a User ID, an Application ID, or “system” if it was created by Propel.
This is the suggested Data Pool column type to use when converting this Data Source column to a Data Pool column.
Propel makes this suggestion based on the Data Source column type. If the Data Source column type is unsupported, this field returns null. Sometimes, you know better which Data Pool column type to convert to. In these cases, you can refer to supportedDataPoolColumnTypes for the full set of supported conversions.
This is the set of supported Data Pool column types you can use when converting this Data Source column to a Data Pool column. If the Data Source column type is unsupported, this field returns an empty array. For example, a numeric Data Source column type could be converted to a narrower or wider numeric Data Pool column type; a string-valued Data Source column type could be mapped to a date or timestamp Data Pool column type.
The Data Source object. A Data Source is a connection to your data warehouse. It has the necessary connection details for Propel to access Snowflake or any other supported Data Source.
The Environments object. Environments are independent and isolated Propel workspaces for development, staging (testing), and production workloads. Environments are hosted in a specific region, initially in us-east-2 only.
A list of checks performed on the Data Source during its most recent connection attempt.
The Data Source Check object. Data Source Checks are executed when setting up your Data Source. They check that Propel will be able to receive data and set up Data Pools. The exact Checks to perform vary by Data Source. For example, Snowflake-backed Data Sources will have their own specific Checks.
If you list Data Pools via the dataPools field on a Data Source, you will get Data Pools for the Data Source. The dataPools field uses cursor-based pagination typical of GraphQL APIs. You can use the pairs of parameters first and after or last and before to page forward or backward through the results, respectively. For forward pagination, the first parameter defines the number of results to return, and the after parameter defines the cursor to continue from. You should pass the cursor for the last result of the current page to after. For backward pagination, the last parameter defines the number of results to return, and the before parameter defines the cursor to continue from. You should pass the cursor for the first result of the current page to before.
The Environments object. Environments are independent and isolated Propel workspaces for development, staging (testing), and production workloads. Environments are hosted in a specific region, initially in us-east-2 only.
A Data Pool’s primary timestamp column. Propel uses the primary timestamp to order and partition your data in Data Pools. It will serve as the time dimension for your Metrics.
A list of setup tasks performed on the Data Pool during its most recent setup attempt.
The Data Pool Setup Task object. Data Pool Setup Tasks are executed when setting up your Data Pool. They ensure Propel will be able to sync records from your Data Source to your Data Pool. The exact Setup Tasks to perform vary by Data Source. For example, Data Pools pointing to Snowflake-backed Data Sources will have their own specific Setup Tasks.
The syncing interval. Note that the syncing interval is approximate. For example, setting the syncing interval to EVERY_1_HOUR
does not mean that syncing will occur exactly on the hour. Instead, the syncing interval starts relative to
when the Data Pool goes LIVE, and Propel will attempt to sync approximately every hour. Additionally,
if you pause or resume syncing, this too can shift the syncing interval around.
The available Data Pool sync intervals. Specifies the unit of time between attempts to sync data from your data warehouse. Note that the syncing interval is approximate. For example, setting the syncing interval to EVERY_1_HOUR does not mean that syncing will occur exactly on the hour. Instead, the syncing interval starts relative to when the Data Pool goes LIVE, and Propel will attempt to sync approximately every hour. Additionally, if you pause or resume syncing, this too can shift the syncing interval around.
Whether the Data Pool has access control enabled or not. If the Data Pool has access control enabled, Applications must be assigned Data Pool Access
Policies in order to query the Data Pool and its Metrics.
Validates a custom expression against the Data Pool’s available columns. If the provided expression is invalid, the ValidateExpressionResult response will contain a reason explaining why.
Response returned by the validateExpression query for validating expressions in Custom Metrics. Returns whether the expression is valid or not with a reason explaining why.
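For example, a Custom Metric expression could be checked before the Metric is created. The sketch below is hedged: the argument names are assumptions; the validateExpression query and the valid/reason pairing follow the descriptions above.

```graphql
# Sketch only: the argument names are assumptions; the query name and the
# valid/reason result fields follow the descriptions above.
query {
  validateExpression(
    input: { dataPool: "DPO_EXAMPLE", expression: "SUM(price * quantity)" }
  ) {
    valid
    reason
  }
}
```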
The Data Pool’s unique ID column. Propel uses the primary timestamp and a unique ID to compose a primary key for determining whether records should be inserted, deleted, or updated within the Data Pool. Deprecated: Will be removed; use Table Settings to define the primary key.
A Data Pool’s unique ID column. Propel uses the primary timestamp and a unique ID to compose a primary key for determining whether records should be inserted, deleted, or updated within the Data Pool.
The Data Pool Setup Task object. Data Pool Setup Tasks are executed when setting up your Data Pool. They ensure Propel will be able to sync records from your Data Source to your Data Pool. The exact Setup Tasks to perform vary by Data Source. For example, Data Pools pointing to Snowflake-backed Data Sources will have their own specific Setup Tasks.
An array of unique values for the Dimension, up to 1,000. Empty if the Dimension contains more than 1,000 unique values. Fetching unique values incurs query costs.
A Propeller determines your Application’s query processing power. The larger the Propeller, the faster the queries and the higher the cost. Every Propel Application (and therefore every set of API credentials) has a Propeller that determines the speed and cost of queries.
The Environments object. Environments are independent and isolated Propel workspaces for development, staging (testing), and production workloads. Environments are hosted in a specific region, initially in us-east-2 only.
A Data Pool’s primary timestamp column. Propel uses the primary timestamp to order and partition your data in Data Pools. It will serve as the time dimension for your Metrics.
A list of setup tasks performed on the Data Pool during its most recent setup attempt.
The Data Pool Setup Task object. Data Pool Setup Tasks are executed when setting up your Data Pool. They ensure Propel will be able to sync records from your Data Source to your Data Pool. The exact Setup Tasks to perform vary by Data Source. For example, Data Pools pointing to Snowflake-backed Data Sources will have their own specific Setup Tasks.
The syncing interval. Note that the syncing interval is approximate. For example, setting the syncing interval to EVERY_1_HOUR
does not mean that syncing will occur exactly on the hour. Instead, the syncing interval starts relative to
when the Data Pool goes LIVE, and Propel will attempt to sync approximately every hour. Additionally,
if you pause or resume syncing, this too can shift the syncing interval around. See DataPoolSyncInterval.
Whether the Data Pool has access control enabled or not. If the Data Pool has access control enabled, Applications must be assigned Data Pool Access
Policies in order to query the Data Pool and its Metrics.
Validates a custom expression against the Data Pool’s available columns. If the provided expression is invalid, the ValidateExpressionResult response will contain a reason explaining why.
Response returned by the validateExpression query for validating expressions in Custom Metrics. Returns whether the expression is valid or not with a reason explaining why.
The Data Pool’s unique ID column. Propel uses the primary timestamp and a unique ID to compose a primary key for determining whether records should be inserted, deleted, or updated within the Data Pool. Deprecated: Will be removed; use Table Settings to define the primary key.
A Data Pool’s unique ID column. Propel uses the primary timestamp and a unique ID to compose a primary key for determining whether records should be inserted, deleted, or updated within the Data Pool.
The Environments object. Environments are independent and isolated Propel workspaces for development, staging (testing), and production workloads. Environments are hosted in a specific region, initially in us-east-2 only.
A Data Pool’s primary timestamp column. Propel uses the primary timestamp to order and partition your data in Data Pools. It will serve as the time dimension for your Metrics.
A list of setup tasks performed on the Data Pool during its most recent setup attempt.
The Data Pool Setup Task object. Data Pool Setup Tasks are executed when setting up your Data Pool. They ensure Propel will be able to sync records from your Data Source to your Data Pool. The exact Setup Tasks to perform vary by Data Source. For example, Data Pools pointing to Snowflake-backed Data Sources will have their own specific Setup Tasks.
The syncing interval. Note that the syncing interval is approximate. For example, setting the syncing interval to EVERY_1_HOUR
does not mean that syncing will occur exactly on the hour. Instead, the syncing interval starts relative to
when the Data Pool goes LIVE, and Propel will attempt to sync approximately every hour. Additionally,
if you pause or resume syncing, this too can shift the syncing interval around. See DataPoolSyncInterval.
Whether the Data Pool has access control enabled or not. If the Data Pool has access control enabled, Applications must be assigned Data Pool Access
Policies in order to query the Data Pool and its Metrics.
Validates a custom expression against the Data Pool’s available columns. If the provided expression is invalid, the ValidateExpressionResult response will contain a reason explaining why.
Response returned by the validateExpression query for validating expressions in Custom Metrics. Returns whether the expression is valid or not with a reason explaining why.
The Data Pool’s unique ID column. Propel uses the primary timestamp and a unique ID to compose a primary key for determining whether records should be inserted, deleted, or updated within the Data Pool. Deprecated: Will be removed; use Table Settings to define the primary key.
A Data Pool’s unique ID column. Propel uses the primary timestamp and a unique ID to compose a primary key for determining whether records should be inserted, deleted, or updated within the Data Pool.
The Environments object. Environments are independent and isolated Propel workspaces for development, staging (testing), and production workloads. Environments are hosted in a specific region, initially in us-east-2 only.
A Data Pool’s primary timestamp column. Propel uses the primary timestamp to order and partition your data in Data Pools. It will serve as the time dimension for your Metrics.
A list of setup tasks performed on the Data Pool during its most recent setup attempt.
The Data Pool Setup Task object. Data Pool Setup Tasks are executed when setting up your Data Pool. They ensure Propel will be able to sync records from your Data Source to your Data Pool. The exact Setup Tasks to perform vary by Data Source. For example, Data Pools pointing to Snowflake-backed Data Sources will have their own specific Setup Tasks.
The syncing interval. Note that the syncing interval is approximate. For example, setting the syncing interval to EVERY_1_HOUR
does not mean that syncing will occur exactly on the hour. Instead, the syncing interval starts relative to
when the Data Pool goes LIVE, and Propel will attempt to sync approximately every hour. Additionally,
if you pause or resume syncing, this too can shift the syncing interval around. See DataPoolSyncInterval.
Whether the Data Pool has access control enabled or not. If the Data Pool has access control enabled, Applications must be assigned Data Pool Access
Policies in order to query the Data Pool and its Metrics.
Validates a custom expression against the Data Pool’s available columns. If the provided expression is invalid, the ValidateExpressionResult response will contain a reason explaining why.
Response returned by the validateExpression query for validating expressions in Custom Metrics. Returns whether the expression is valid or not with a reason explaining why.
The Data Pool’s unique ID column. Propel uses the primary timestamp and a unique ID to compose a primary key for determining whether records should be inserted, deleted, or updated within the Data Pool. Deprecated: Will be removed; use Table Settings to define the primary key.
A Data Pool’s unique ID column. Propel uses the primary timestamp and a unique ID to compose a primary key for determining whether records should be inserted, deleted, or updated within the Data Pool.
The Environments object. Environments are independent and isolated Propel workspaces for development, staging (testing), and production workloads. Environments are hosted in a specific region, initially in us-east-2 only.
Whether the Data Pool has access control enabled or not. If the Data Pool has access control enabled, Applications must be assigned Data Pool Access
Policies in order to query the Data Pool and its Metrics.
Validates a custom expression against the Data Pool’s available columns. If the provided expression is invalid, the ValidateExpressionResult response will contain a reason explaining why.
The Data Pool’s unique ID column. Propel uses the primary timestamp and a unique ID to compose a primary key for determining whether records should be inserted, deleted, or updated within the Data Pool. Deprecated: Will be removed; use Table Settings to define the primary key. See UniqueId.
Whether the Data Pool has access control enabled or not. If the Data Pool has access control enabled, Applications must be assigned Data Pool Access
Policies in order to query the Data Pool and its Metrics.
Validates a custom expression against the Data Pool’s available columns. If the provided expression is invalid, the ValidateExpressionResult response will contain a reason explaining why.
The Data Pool’s unique ID column. Propel uses the primary timestamp and a unique ID to compose a primary key for determining whether records should be inserted, deleted, or updated within the Data Pool. Deprecated: Will be removed; use Table Settings to define the primary key. See UniqueId.
Whether the Data Pool has access control enabled or not. If the Data Pool has access control enabled, Applications must be assigned Data Pool Access
Policies in order to query the Data Pool and its Metrics.
Validates a custom expression against the Data Pool’s available columns. If the provided expression is invalid, the ValidateExpressionResult response will contain a reason explaining why.
The Data Pool’s unique ID column. Propel uses the primary timestamp and a unique ID to compose a primary key for determining whether records should be inserted, deleted, or updated within the Data Pool. Deprecated: Will be removed; use Table Settings to define the primary key. See UniqueId.
A Data Pool’s primary timestamp column. Propel uses the primary timestamp to order and partition your data in Data Pools. It will serve as the time dimension for your Metrics.
A Data Pool’s unique ID column. Propel uses the primary timestamp and a unique ID to compose a primary key for determining whether records should be inserted, deleted, or updated within the Data Pool.
The Environments object. Environments are independent and isolated Propel workspaces for development, staging (testing), and production workloads. Environments are hosted in a specific region, initially in us-east-2 only.
A Data Pool’s primary timestamp column. Propel uses the primary timestamp to order and partition your data in Data Pools. It will serve as the time dimension for your Metrics.
A list of setup tasks performed on the Data Pool during its most recent setup attempt.
The Data Pool Setup Task object. Data Pool Setup Tasks are executed when setting up your Data Pool. They ensure Propel will be able to sync records from your Data Source to your Data Pool. The exact Setup Tasks to perform vary by Data Source. For example, Data Pools pointing to Snowflake-backed Data Sources will have their own specific Setup Tasks.
The syncing interval. Note that the syncing interval is approximate. For example, setting the syncing interval to EVERY_1_HOUR
does not mean that syncing will occur exactly on the hour. Instead, the syncing interval starts relative to
when the Data Pool goes LIVE, and Propel will attempt to sync approximately every hour. Additionally,
if you pause or resume syncing, this too can shift the syncing interval around. See DataPoolSyncInterval.
Whether the Data Pool has access control enabled or not. If the Data Pool has access control enabled, Applications must be assigned Data Pool Access
Policies in order to query the Data Pool and its Metrics.
Validates a custom expression against the Data Pool’s available columns. If the provided expression is invalid, the ValidateExpressionResult response will contain a reason explaining why.
Response returned by the validateExpression query for validating expressions in Custom Metrics. Returns whether the expression is valid or not with a reason explaining why.
The Data Pool’s unique ID column. Propel uses the primary timestamp and a unique ID to compose a primary key for determining whether records should be inserted, deleted, or updated within the Data Pool. Deprecated: Will be removed; use Table Settings to define the primary key.
A Data Pool’s unique ID column. Propel uses the primary timestamp and a unique ID to compose a primary key for determining whether records should be inserted, deleted, or updated within the Data Pool.
The syncing interval. Note that the syncing interval is approximate. For example, setting the syncing interval to EVERY_1_HOUR
does not mean that syncing will occur exactly on the hour. Instead, the syncing interval starts relative to
when the Data Pool goes LIVE, and Propel will attempt to sync approximately every hour. Additionally,
if you pause or resume syncing, this too can shift the syncing interval around.
The available Data Pool sync intervals. Specifies the unit of time between attempts to sync data from your data warehouse. Note that the syncing interval is approximate. For example, setting the syncing interval to EVERY_1_HOUR does not mean that syncing will occur exactly on the hour. Instead, the syncing interval starts relative to when the Data Pool goes LIVE, and Propel will attempt to sync approximately every hour. Additionally, if you pause or resume syncing, this too can shift the syncing interval around.
The Environments object. Environments are independent and isolated Propel workspaces for development, staging (testing), and production workloads. Environments are hosted in a specific region, initially in us-east-2 only.
The Data Source object. A Data Source is a connection to your data warehouse. It has the necessary connection details for Propel to access Snowflake or any other supported Data Source.
The Environments object. Environments are independent and isolated Propel workspaces for development, staging (testing), and production workloads. Environments are hosted in a specific region, initially in us-east-2 only.
A list of checks performed on the Data Source during its most recent connection attempt.
The Data Source Check object. Data Source Checks are executed when setting up your Data Source. They check that Propel will be able to receive data and set up Data Pools. The exact Checks to perform vary by Data Source. For example, Snowflake-backed Data Sources will have their own specific Checks.
If you list Data Pools via the dataPools field on a Data Source, you will get Data Pools for the Data Source. The dataPools field uses cursor-based pagination typical of GraphQL APIs. You can use the pairs of parameters first and after or last and before to page forward or backward through the results, respectively. For forward pagination, the first parameter defines the number of results to return, and the after parameter defines the cursor to continue from. You should pass the cursor for the last result of the current page to after. For backward pagination, the last parameter defines the number of results to return, and the before parameter defines the cursor to continue from. You should pass the cursor for the first result of the current page to before.
The number of new records contained within the Sync, if known. This excludes filtered records. Deprecated: All records are considered to be processed; see processedRecords instead.
The number of updated records contained within the Sync, if known. This excludes filtered records. Deprecated: All records are considered to be processed; see processedRecords instead.
The number of deleted records contained within the Sync, if known. This excludes filtered records. Deprecated: All records are considered to be processed; see processedRecords instead.
The number of filtered records contained within the Sync, due to issues such as a missing timestamp Dimension, if any are known to be invalid. Deprecated: All records are considered to be processed; see processedRecords instead.
The Metric Report connection object. It includes headers and rows for a single page of a report. It also allows paging forward and backward to other
pages of the report.
An ordered array of columns. Each column contains the dimension and Metric values for a single row, as defined in the report input. Use this to display a single row within your table.
A Propeller determines your Application’s query processing power. The larger the Propeller, the faster the queries and the higher the cost. Every Propel Application (and therefore every set of API credentials) has a Propeller that determines the speed and cost of queries.
An ordered array of columns. Each column contains the dimension and Metric values for a single row, as defined in the report input. Use this to display a single row within your table.
The Environments object. Environments are independent and isolated Propel workspaces for development, staging (testing), and production workloads. Environments are hosted in a specific region, initially in us-east-2 only.
An array of unique values for the Dimension, up to 1,000. Empty if the Dimension contains more than 1,000 unique values. Fetching unique values incurs query costs.
COUNT: Counts the number of records that match the Metric Filters. For time series, it will count the values for each time granularity.
SUM: Sums the values of the specified column for every record that matches the Metric Filters. For time series, it will sum the values for each time granularity.
COUNT_DISTINCT: Counts the number of distinct values in the specified column for every record that matches the Metric Filters. For time series, it will count the distinct values for each time granularity.
AVERAGE: Averages the values of the specified column for every record that matches the Metric Filters. For time series, it will average the values for each time granularity.
MIN: Selects the minimum value of the specified column for every record that matches the Metric Filters. For time series, it will select the minimum value for each time granularity.
MAX: Selects the maximum value of the specified column for every record that matches the Metric Filters. For time series, it will select the maximum value for each time granularity.
CUSTOM: Aggregates values based on the provided custom expression.
Metric Filters allow defining a Metric with a subset of records from the given Data Pool. If no Metric Filters are present, all records will be included. To filter at query time, add Dimensions and use the filters property on the timeSeriesInput, counterInput, or leaderboardInput objects. There is no need to add filters to be able to filter at query time. You can provide the filters in the form of SQL.
Metric Filters allow defining a Metric with a subset of records from the given Data Pool. If no Metric Filters are present, all records will be included. To filter at query time, add Dimensions and use the filters property on the timeSeriesInput, counterInput, or leaderboardInput objects. There is no need to add filters to be able to filter at query time. Deprecated: Use filterSql instead. See Filter.
Metric Filters allow defining a Metric with a subset of records from the given Data Pool. If no Metric Filters are present, all records will be included. To filter at query time, add Dimensions and use the filters property on the timeSeriesInput, counterInput, or leaderboardInput objects. There is no need to add filters to be able to filter at query time. You can provide the filters in the form of SQL.
Metric Filters allow defining a Metric with a subset of records from the given Data Pool. If no Metric Filters are present, all records will be included. To filter at query time, add Dimensions and use the filters property on the timeSeriesInput, counterInput, or leaderboardInput objects. There is no need to add filters to be able to filter at query time. Deprecated: Use filterSql instead. See Filter.
Metric Filters allow defining a Metric with a subset of records from the given Data Pool. If no Metric Filters are present, all records will be included. To filter at query time, add Dimensions and use the filters property on the timeSeriesInput, counterInput, or leaderboardInput objects. There is no need to add filters to be able to filter at query time. You can provide the filters in the form of SQL.
Metric Filters allow defining a Metric with a subset of records from the given Data Pool. If no Metric Filters are present, all records will be included. To filter at query time, add Dimensions and use the filters property on the timeSeriesInput, counterInput, or leaderboardInput objects. There is no need to add filters to be able to filter at query time. Deprecated: Use filterSql instead. See Filter.
Metric Filters allow defining a Metric with a subset of records from the given Data Pool. If no Metric Filters are present, all records will be included. To filter at query time, add Dimensions and use the filters property on the timeSeriesInput, counterInput, or leaderboardInput objects. There is no need to add filters to be able to filter at query time. You can provide the filters in the form of SQL.
Metric Filters allow defining a Metric with a subset of records from the given Data Pool. If no Metric Filters are present, all records will be included. To filter at query time, add Dimensions and use the filters property on the timeSeriesInput, counterInput, or leaderboardInput objects. There is no need to add filters to be able to filter at query time. Deprecated: Use filterSql instead. See Filter.
Metric Filters allow defining a Metric with a subset of records from the given Data Pool. If no Metric Filters are present, all records will be included. To filter at query time, add Dimensions and use the filters property on the timeSeriesInput, counterInput, or leaderboardInput objects. There is no need to add filters to be able to filter at query time. You can provide the filters in the form of SQL.
Metric Filters allow defining a Metric with a subset of records from the given Data Pool. If no Metric Filters are present, all records will be included. To filter at query time, add Dimensions and use the filters property on the timeSeriesInput, counterInput, or leaderboardInput objects. There is no need to add filters to be able to filter at query time. Deprecated: Use filterSql instead. See Filter.
Metric Filters allow defining a Metric with a subset of records from the given Data Pool. If no Metric Filters are present, all records will be included. To filter at query time, add Dimensions and use the filters property on the timeSeriesInput, counterInput, or leaderboardInput objects. There is no need to add filters to be able to filter at query time. You can provide the filters in the form of SQL.
Metric Filters allow defining a Metric with a subset of records from the given Data Pool. If no Metric Filters are present, all records will be included. To filter at query time, add Dimensions and use the filters property on the timeSeriesInput, counterInput, or leaderboardInput objects. There is no need to add filters to be able to filter at query time. Deprecated: Use filterSql instead. See Filter.
Metric Filters allow defining a Metric with a subset of records from the given Data Pool. If no Metric Filters are present, all records will be included. To filter at query time, add Dimensions and use the filters property on the timeSeriesInput, counterInput, or leaderboardInput objects. There is no need to add filters to be able to filter at query time. You can provide the filters in the form of SQL.
Metric Filters allow defining a Metric with a subset of records from the given Data Pool. If no Metric Filters are present, all records will be included. To filter at query time, add Dimensions and use the filters property on the timeSeriesInput, counterInput, or leaderboardInput objects. There is no need to add filters to be able to filter at query time. Deprecated: Use filterSql instead. See Filter.
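For any of the Metric settings above, the SQL form of a Metric Filter can be a boolean expression over the Data Pool’s columns, for example (the column names below are placeholders):

```graphql
# Sketch only: "status" and "amount" are placeholder column names.
filterSql: "status = 'confirmed' AND amount > 0"
```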
An array of unique values for the Dimension, up to 1,000. Empty if the Dimension contains more than 1,000 unique values. Fetching unique values incurs query costs.
Query the Metric in counter format. Returns the Metric’s value for the given time range and filters. Deprecated: Use the top-level counter query instead.
The fields required to specify the time range for a time series, counter, or leaderboard Metric query. If no relative or absolute time ranges are provided, Propel defaults to an absolute time range beginning with the earliest record in the Metric’s Data Pool and ending with the latest record. If both relative and absolute time ranges are provided, the relative time range will take precedence. If a LAST_N relative time period is selected, an n ≥ 1 must be provided. If no n is provided or n < 1, a BAD_REQUEST error will be returned.
The timestamp field to use when querying. Defaults to the timestamp configured on the Data Pool or Metric, if any.
Set this to filter on an alternative timestamp field.
The time zone to use. Dates and times are always returned in UTC, but setting the time zone influences relative time ranges and granularities. You can set this to “America/Los_Angeles”, “Europe/Berlin”, or any other value in the IANA time zone database. Defaults to “UTC”.
The Query Filters to apply before retrieving the counter data. If no Query Filters are provided, all data is included. Deprecated: Use filterSql instead.
The fields of a filter. You can construct more complex filters using and and or. For example, to construct a filter equivalent to:
(value > 0 AND value <= 100) OR status = "confirmed"
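you could nest the conditions as in the sketch below. Only the and and or fields are described in this section; the column, operator, and value names and the operator enum values are assumptions. Because AND takes precedence over OR, the two value conditions are grouped via and, and the status condition is attached via or.

```graphql
# Sketch only: "column", "operator", "value", and the operator enum values are
# assumptions; "and" and "or" are the fields described above. The top-level
# condition ANDs with its "and" list, then ORs with its "or" list, yielding
# (value > 0 AND value <= 100) OR status = "confirmed".
{
  column: "value"
  operator: GREATER_THAN
  value: "0"
  and: [{ column: "value", operator: LESS_THAN_OR_EQUAL_TO, value: "100" }]
  or: [{ column: "status", operator: EQUALS, value: "confirmed" }]
}
```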
Query the Metric in time series format. Returns arrays of timestamps and the Metric’s values for the given time range and filters. Deprecated: Use the top-level timeSeries query instead.
The fields for querying a Metric in time series format. A Metric’s time series query returns the values over a given time range aggregated by a given time granularity: day, month, or year, for example.
The fields required to specify the time range for a time series, counter, or leaderboard Metric query. If no relative or absolute time ranges are provided, Propel defaults to an absolute time range beginning with the earliest record in the Metric’s Data Pool and ending with the latest record. If both relative and absolute time ranges are provided, the relative time range will take precedence. If a LAST_N relative time period is selected, an n ≥ 1 must be provided. If no n is provided or n < 1, a BAD_REQUEST error will be returned.
The timestamp field to use when querying. Defaults to the timestamp configured on the Data Pool or Metric, if any.
Set this to filter on an alternative timestamp field.
The time zone to use. Dates and times are always returned in UTC, but setting the time zone influences relative time ranges and granularities. You can set this to “America/Los_Angeles”, “Europe/Berlin”, or any other value in the IANA time zone database. Defaults to “UTC”.
The time granularity (hour, day, month, etc.) to aggregate the Metric values by.
The available time series granularities. Granularities define the unit of time to aggregate the Metric data for a time series query. For example, if the granularity is set to DAY, then the time series query will return a label and a value for each day. If there are no records for a given time series granularity, Propel will return the label and a value of “0” so that the time series can be properly visualized.
The Query Filters to apply before retrieving the time series data. If no Query Filters are provided, all data is included. Deprecated: Use filterSql instead.
The fields of a filter. You can construct more complex filters using and and or. For example, to construct a filter equivalent to:
(value > 0 AND value <= 100) OR status = "confirmed"
The time series values for each group in groupBy, if specified.
The time series response object for a group specified in groupBy. It contains an array of time series labels and an array of Metric values for a particular group.
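Putting these pieces together, a top-level timeSeries query might look like the sketch below. The metricName argument, the timeRange shape, and the labels/values selections are assumptions; granularity and timeZone behave as described above.

```graphql
# Sketch only: "metricName", the "timeRange" shape, and the response fields
# are assumptions; "granularity" and "timeZone" follow the descriptions above.
query {
  timeSeries(
    input: {
      metricName: "revenue"
      timeRange: { relative: LAST_N_DAYS, n: 30 }
      timeZone: "America/Los_Angeles"
      granularity: DAY
    }
  ) {
    labels
    values
  }
}
```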
Query the Metric in leaderboard format. Returns a table (array of rows) with the selected dimensions and the Metric’s corresponding values for the given time range and filters. Deprecated: Use the top-level leaderboard query instead.
The fields for querying a Metric in leaderboard format. A Metric’s leaderboard query returns an ordered table of Dimension and Metric values over a given time range.
The fields required to specify the time range for a time series, counter, or leaderboard Metric query. If no relative or absolute time ranges are provided, Propel defaults to an absolute time range beginning with the earliest record in the Metric’s Data Pool and ending with the latest record. If both relative and absolute time ranges are provided, the relative time range will take precedence. If a LAST_N relative time period is selected, an n ≥ 1 must be provided. If no n is provided or n < 1, a BAD_REQUEST error will be returned.
The timestamp field to use when querying. Defaults to the timestamp configured on the Data Pool or Metric, if any.
Set this to filter on an alternative timestamp field.
The time zone to use. Dates and times are always returned in UTC, but setting the time zone influences relative time ranges and granularities. You can set this to “America/Los_Angeles”, “Europe/Berlin”, or any other value in the IANA time zone database. Defaults to “UTC”.
The Query Filters to apply before retrieving the leaderboard data. If no Query Filters are provided, all data is included. Deprecated: Use filterSql instead.
The fields of a filter. You can construct more complex filters using and and or. For example, to construct a filter equivalent to:
(value > 0 AND value <= 100) OR status = "confirmed"
Additional filters to OR with this one. AND takes precedence over OR.
The leaderboard response object. It contains an array of headers and a table (array of rows) with the selected Dimensions and corresponding Metric values for the given time range and Query Filters.
An ordered array of rows. Each row contains the Dimension values and the corresponding Metric value. A Dimension value can be empty. A Metric value will never be empty.
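Similarly, a top-level leaderboard query might look like the following sketch. The metricName, dimensions, sort, and rowLimit argument names are assumptions; the headers and rows selections follow the leaderboard response description above.

```graphql
# Sketch only: the argument names are assumptions; "headers" and "rows"
# follow the leaderboard response description above.
query {
  leaderboard(
    input: {
      metricName: "revenue"
      timeRange: { relative: LAST_N_DAYS, n: 7 }
      dimensions: [{ columnName: "country" }]
      sort: DESC
      rowLimit: 10
    }
  ) {
    headers
    rows
  }
}
```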
Metric Filters allow defining a Metric with a subset of records from the given Data Pool. If no Metric Filters are present, all records will be included. To filter at query time, add Dimensions and use the filters property on the timeSeriesInput, counterInput, or leaderboardInput objects. There is no need to add filters to be able to filter at query time. You can provide the filters in the form of SQL.
Metric Filters allow defining a Metric with a subset of records from the given Data Pool. If no Metric Filters are present, all records will be included. To filter at query time, add Dimensions and use the filters property on the timeSeriesInput, counterInput, or leaderboardInput objects. There is no need to add filters to be able to filter at query time. Deprecated: Use filterSql instead.
The fields of a filter. You can construct more complex filters using and and or. For example, to construct a filter equivalent to:
(value > 0 AND value <= 100) OR status = "confirmed"