The Environments object. Environments are independent and isolated Propel workspaces for development, staging (testing), and production workloads. Environments are hosted in a specific region, initially in us-east-2 only.
The Application object. Propel Applications represent the web or mobile app you are building. They provide the API credentials that allow your client- or server-side app to access the Propel API. The Application’s Propeller determines the speed and cost of your Metric Queries.
A Propeller determines your Application’s query processing power. The larger the Propeller, the faster the queries and the higher the cost. Every Propel Application (and therefore every set of API credentials) has a Propeller that determines the speed and cost of queries.
The Data Source object. A Data Source is a connection to your data warehouse. It has the necessary connection details for Propel to access Snowflake or any other supported Data Source.
Enables or disables access control for the Data Pool.
If the Data Pool has access control enabled, Applications must be assigned Data Pool Access Policies in order to query the Data Pool and its Metrics.
HTTP basic access authentication credentials. You must configure these same credentials to be included in the X-Amz-Firehose-Access-Key header when Amazon Data Firehose issues requests to its custom HTTP endpoint. See HttpBasicAuthSettings.
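The contract here is simple: every delivery request from Amazon Data Firehose carries the configured credential in the X-Amz-Firehose-Access-Key header, and the receiving endpoint accepts the request only when it matches. A minimal sketch of that check (the endpoint-side verification is handled by Propel; the credential value below is a placeholder):

```python
# Hypothetical credential configured both in Propel and in the Firehose
# delivery stream settings.
EXPECTED_ACCESS_KEY = "my-configured-password"

def is_authorized(headers: dict) -> bool:
    """Accept a Firehose request only when the access-key header matches."""
    return headers.get("X-Amz-Firehose-Access-Key") == EXPECTED_ACCESS_KEY
```

A request carrying the configured key passes the check; a missing or mismatched key is rejected before any records are ingested.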
Override the Data Pool’s table settings. These describe how the Data Pool’s table is created in ClickHouse, and a default will be chosen based on the Data Pool’s timestamp value, if any. You can override these defaults in order to specify a custom table engine, custom ORDER BY, etc. See TableSettings.
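As a rough illustration, an override might be passed as a GraphQL input object. The field names below (engine, orderBy, partitionBy) mirror common ClickHouse table options and are assumptions; check the TableSettings reference for the exact input shape.

```python
# Hypothetical tableSettings input, e.g. supplied as GraphQL variables.
table_settings = {
    "engine": {"type": "MERGE_TREE"},        # assumed custom table engine field
    "orderBy": ["timestamp", "account_id"],  # assumed custom ORDER BY columns
    "partitionBy": ["toYYYYMM(timestamp)"],  # assumed optional partitioning
}
```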
HTTP basic access authentication credentials. You must configure these same credentials to be included in the X-Amz-Firehose-Access-Key header when Amazon Data Firehose transmits records from your DynamoDB table to its custom HTTP endpoint. See HttpBasicAuthSettings.
Copy this value into the X-Amz-Firehose-Access-Key header when configuring your Amazon Data Firehose to transmit records from your DynamoDB table to its custom HTTP endpoint.
The HTTP Basic authentication settings for uploading new data. If this parameter is not provided, anyone with the URL to your tables will be able to upload data. While it’s OK to test without HTTP Basic authentication, we recommend enabling it. See HttpBasicAuthSettings.
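Once HTTP Basic authentication is enabled, an uploader sends the standard Authorization header: the string "username:password" Base64-encoded and prefixed with "Basic". A minimal sketch (the credential values are placeholders):

```python
import base64

def basic_auth_header(username: str, password: str) -> str:
    """Build the HTTP Basic Authorization header value for an upload request."""
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    return f"Basic {token}"
```

For example, `basic_auth_header("user", "pass")` produces `Basic dXNlcjpwYXNz`, which would be sent as the Authorization header alongside the upload.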
The connection settings for an Amazon S3 Data Source. These include the Amazon S3 bucket name, the AWS access key ID, and the tables (along with their paths). We do not allow fetching the AWS secret access key after it has been set.
The HTTP basic authentication settings for the Twilio Segment Data Source URL. If this parameter is not provided, anyone with the webhook URL will be able to send events. While it’s OK to test without HTTP Basic authentication, we recommend enabling it. See HttpBasicAuthSettings.
Override the Data Pool’s table settings. These describe how the Data Pool’s table is created in ClickHouse, and a default will be chosen based on the Data Pool’s timestamp and uniqueId values, if any. You can override these defaults in order to specify a custom table engine, custom ORDER BY, etc. See TableSettings.
The HTTP basic authentication settings for the Webhook Data Source URL. If this parameter is not provided, anyone with the webhook URL will be able to send events. While it’s OK to test without HTTP Basic authentication, we recommend enabling it. See HttpBasicAuthSettings.
The unique ID column, if any. Propel uses the primary timestamp and a unique ID to compose a primary key for determining whether records should be inserted, deleted, or updated. Deprecated: will be removed; use Table Settings to define the primary key.
A list of checks performed on the Data Source during its most recent connection attempt.
The Data Source Check object. Data Source Checks are executed when setting up your Data Source. They check that Propel will be able to receive data and set up Data Pools. The exact Checks to perform vary by Data Source. For example, Snowflake-backed Data Sources will have their own specific Checks.
If you list Data Pools via the dataPools field on a Data Source, you will get Data Pools for the Data Source.

The dataPools field uses cursor-based pagination typical of GraphQL APIs. You can use the pairs of parameters first and after or last and before to page forward or backward through the results, respectively.

For forward pagination, the first parameter defines the number of results to return, and the after parameter defines the cursor to continue from. You should pass the cursor for the last result of the current page to after.

For backward pagination, the last parameter defines the number of results to return, and the before parameter defines the cursor to continue from. You should pass the cursor for the first result of the current page to before.
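The forward-pagination loop described above can be sketched as follows, with the network call stubbed out. Here `fetch_page` stands in for a real GraphQL request against the dataPools field and returns the page's nodes, the cursor of its last result, and whether more pages exist; those return values are assumptions about the connection shape.

```python
def paginate_forward(fetch_page, page_size=2):
    """Yield every node, feeding the last cursor of each page back as `after`."""
    after = None
    while True:
        nodes, end_cursor, has_next = fetch_page(first=page_size, after=after)
        yield from nodes
        if not has_next:
            break
        after = end_cursor  # continue from the last result of the current page

# Stub standing in for the GraphQL API: three Data Pools, served two per page.
_DATA = ["pool-a", "pool-b", "pool-c"]

def fake_fetch(first, after):
    start = 0 if after is None else _DATA.index(after) + 1
    page = _DATA[start:start + first]
    last = page[-1] if page else after
    return page, last, start + first < len(_DATA)
```

Running `list(paginate_forward(fake_fetch))` walks both pages and collects all three Data Pools in order. Backward pagination is symmetric, using last/before and the cursor of the first result instead.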
The table introspection object. When setting up a Data Source, Propel may need to introspect tables in order to determine what tables and columns are available to create Data Pools from. The table introspection represents the lifecycle of this operation (whether it’s in-progress, succeeded, or failed) and the resulting tables and columns. These will be captured as table and column objects, respectively.
The table’s creator. This corresponds to the initiator of the table Introspection. It can be either a User ID, an Application ID, or “system” if it was created by Propel.
The column object. Once a table introspection succeeds, it creates a new table object for every table it introspected. Within each table object, it also creates a column object for every column it introspected.
The column’s creator. This corresponds to the initiator of the table introspection. It can be either a User ID, an Application ID, or “system” if it was created by Propel.
This is the suggested Data Pool column type to use when converting this Data Source column to a Data Pool column. Propel makes this suggestion based on the Data Source column type. If the Data Source column type is unsupported, this field returns null. Sometimes, you know better which Data Pool column type to convert to. In these cases, you can refer to supportedDataPoolColumnTypes for the full set of supported conversions. See ColumnType.
This is the set of supported Data Pool column types you can use when converting this Data Source column to a Data Pool column. If the Data Source column type is unsupported, this field returns an empty array. For example, a numeric Data Source column type could be converted to a narrower or wider numeric Data Pool column type; a string-valued Data Source column type could be mapped to a date or timestamp Data Pool column type. See ColumnType.
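Taken together, the two fields suggest a simple selection strategy: use the suggested type when one exists, otherwise fall back to one of the supported conversions, and treat an empty supported set as an unsupported column. A sketch of that logic (the type names in the example are illustrative, not an exhaustive ColumnType list):

```python
def choose_column_type(suggested, supported):
    """Pick a Data Pool column type from introspection hints.

    Prefers the suggested type; otherwise falls back to the first supported
    conversion; returns None when the source column type is unsupported.
    """
    if suggested is not None:
        return suggested
    return supported[0] if supported else None
```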
A Data Pool’s primary timestamp column. Propel uses the primary timestamp to order and partition your data in Data Pools. It will serve as the time dimension for your Metrics.
A list of setup tasks performed on the Data Pool during its most recent setup attempt.
The Data Pool Setup Task object. Data Pool Setup Tasks are executed when setting up your Data Pool. They ensure Propel will be able to sync records from your Data Source to your Data Pool. The exact Setup Tasks to perform vary by Data Source. For example, Data Pools pointing to Snowflake-backed Data Sources will have their own specific Setup Tasks.
The syncing interval. Note that the syncing interval is approximate. For example, setting the syncing interval to EVERY_1_HOUR does not mean that syncing will occur exactly on the hour. Instead, the syncing interval starts relative to when the Data Pool goes LIVE, and Propel will attempt to sync approximately every hour. Additionally, if you pause or resume syncing, this too can shift the syncing interval around.
The available Data Pool sync intervals. Specify the unit of time between attempts to sync data from your data warehouse. Note that the syncing interval is approximate. For example, setting the syncing interval to EVERY_1_HOUR does not mean that syncing will occur exactly on the hour. Instead, the syncing interval starts relative to when the Data Pool goes LIVE, and Propel will attempt to sync approximately every hour. Additionally, if you pause or resume syncing, this too can shift the syncing interval around.
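The "relative to LIVE" semantics can be made concrete: the next sync attempt lands at the next whole interval boundary counted from the LIVE timestamp, not from the top of the hour. A sketch under those assumptions (interval names like EVERY_1_HOUR are parsed into a timedelta; actual sync times remain approximate):

```python
from datetime import datetime, timedelta

# Minimal unit map for the illustrative interval names used here.
_UNITS = {"MINUTE": "minutes", "MINUTES": "minutes",
          "HOUR": "hours", "HOURS": "hours"}

def approximate_next_sync(live_at: datetime, interval: str, now: datetime) -> datetime:
    """Estimate the next sync boundary, anchored to when the Data Pool went LIVE."""
    _, count, unit = interval.split("_")              # e.g. "EVERY_1_HOUR"
    step = timedelta(**{_UNITS[unit]: int(count)})
    periods = (now - live_at) // step + 1             # next whole period after `now`
    return live_at + periods * step
```

For a Data Pool that went LIVE at 00:00 with EVERY_1_HOUR, a check at 02:30 yields 03:00 as the next approximate attempt. Pausing and resuming syncing can shift this anchor, as noted above.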
Whether the Data Pool has access control enabled or not. If the Data Pool has access control enabled, Applications must be assigned Data Pool Access Policies in order to query the Data Pool and its Metrics.
Validates a custom expression against the Data Pool’s available columns. If the provided expression is invalid, the ValidateExpressionResult response will contain a reason explaining why.
Response returned by the validateExpression query for validating expressions in Custom Metrics. Returns whether the expression is valid or not, with a reason explaining why.
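A call might be assembled as below. The query's argument names and input shape here are assumptions for illustration; consult the validateExpression reference for the real signature before use.

```python
# Hypothetical GraphQL document for validateExpression; field and argument
# names are assumed, not taken from the schema.
VALIDATE_EXPRESSION_QUERY = """
query ($dataPoolId: ID!, $expression: String!) {
  validateExpression(input: {dataPool: $dataPoolId, expression: $expression}) {
    valid
    reason
  }
}
"""

def build_request(data_pool_id: str, expression: str) -> dict:
    """Assemble the JSON body for an HTTP POST to the GraphQL endpoint."""
    return {
        "query": VALIDATE_EXPRESSION_QUERY,
        "variables": {"dataPoolId": data_pool_id, "expression": expression},
    }
```

On an invalid expression, the response's valid field would be false and reason would explain why, per the description above.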
The Data Pool’s unique ID column. Propel uses the primary timestamp and a unique ID to compose a primary key for determining whether records should be inserted, deleted, or updated within the Data Pool. Deprecated: will be removed; use table settings to define the primary key.
A Data Pool’s unique ID column. Propel uses the primary timestamp and a unique ID to compose a primary key for determining whether records should be inserted, deleted, or updated within the Data Pool.
An array of unique values for the Dimension, up to 1,000. Empty if the Dimension contains more than 1,000 unique values. Fetching unique values incurs query costs.
A Propeller determines your Application’s query processing power. The larger the Propeller, the faster the queries and the higher the cost. Every Propel Application (and therefore every set of API credentials) has a Propeller that determines the speed and cost of queries.
The Environments object.Environments are independent and isolated Propel workspaces for development, staging (testing), and production workloads. Environments are hosted in a specific region, initially in us-east-2 only.
The Environments object.Environments are independent and isolated Propel workspaces for development, staging (testing), and production workloads. Environments are hosted in a specific region, initially in us-east-2 only.
A Data Pool’s primary timestamp column. Propel uses the primary timestamp to order and partition your data in Data Pools. It will serve as the time dimension for your Metrics.
A list of setup tasks performed on the Data Pool during its most recent setup attempt.
Show DataPoolSetupTask
The Data Pool Setup Task object.Data Pool Setup Tasks are executed when setting up your Data Pool. They ensure Propel will be able to sync records from your Data Source to your Data Pool.The exact Setup Tasks to perform vary by Data Source. For example, Data Pools pointing to a Snowflake-backed Data Sources will have their own specific Setup Tasks.
The syncing interval.Note that the syncing interval is approximate. For example, setting the syncing interval to EVERY_1_HOUR
does not mean that syncing will occur exactly on the hour. Instead, the syncing interval starts relative to
when the Data Pool goes LIVE, and Propel will attempt to sync approximately every hour. Additionally,
if you pause or resume syncing, this too can shift the syncing interval around.See DataPoolSyncInterval
Whether the Data Pool has access control enabled or not.If the Data Pool has access control enabled, Applications must be assigned Data Pool Access
Policies in order to query the Data Pool and its Metrics.
Validates a custom expression against the Data Pool’s available columns. If the provided expression is invalid, the ValidateExpressionResult response will contain a reason explaining why.
Response returned by the validateExpression query for validating expressions in Custom Metrics.Returns whether the expression is valid or not with a reason explaining why.
The Data Pool’s unique ID column. Propel uses the primary timestamp and a unique ID to compose a primary key for determining whether records should be inserted, deleted, or updated within the Data Pool.deprecated: Will be removed; use table settings to define the primary key.
Show UniqueId
A Data Pool’s unique ID column. Propel uses the primary timestamp and a unique ID to compose a primary key for determining whether records should be inserted, deleted, or updated within the Data Pool.
The Environments object.Environments are independent and isolated Propel workspaces for development, staging (testing), and production workloads. Environments are hosted in a specific region, initially in us-east-2 only.
A Data Pool’s primary timestamp column. Propel uses the primary timestamp to order and partition your data in Data Pools. It will serve as the time dimension for your Metrics.
A list of setup tasks performed on the Data Pool during its most recent setup attempt.
Show DataPoolSetupTask
The Data Pool Setup Task object.Data Pool Setup Tasks are executed when setting up your Data Pool. They ensure Propel will be able to sync records from your Data Source to your Data Pool.The exact Setup Tasks to perform vary by Data Source. For example, Data Pools pointing to a Snowflake-backed Data Sources will have their own specific Setup Tasks.
The syncing interval.Note that the syncing interval is approximate. For example, setting the syncing interval to EVERY_1_HOUR
does not mean that syncing will occur exactly on the hour. Instead, the syncing interval starts relative to
when the Data Pool goes LIVE, and Propel will attempt to sync approximately every hour. Additionally,
if you pause or resume syncing, this too can shift the syncing interval around.See DataPoolSyncInterval
Whether the Data Pool has access control enabled or not.If the Data Pool has access control enabled, Applications must be assigned Data Pool Access
Policies in order to query the Data Pool and its Metrics.
Validates a custom expression against the Data Pool’s available columns. If the provided expression is invalid, the ValidateExpressionResult response will contain a reason explaining why.
Response returned by the validateExpression query for validating expressions in Custom Metrics.Returns whether the expression is valid or not with a reason explaining why.
The Data Pool’s unique ID column. Propel uses the primary timestamp and a unique ID to compose a primary key for determining whether records should be inserted, deleted, or updated within the Data Pool.deprecated: Will be removed; use table settings to define the primary key.
Show UniqueId
A Data Pool’s unique ID column. Propel uses the primary timestamp and a unique ID to compose a primary key for determining whether records should be inserted, deleted, or updated within the Data Pool.
The Environments object.Environments are independent and isolated Propel workspaces for development, staging (testing), and production workloads. Environments are hosted in a specific region, initially in us-east-2 only.
A Data Pool’s primary timestamp column. Propel uses the primary timestamp to order and partition your data in Data Pools. It will serve as the time dimension for your Metrics.
A list of setup tasks performed on the Data Pool during its most recent setup attempt.
Show DataPoolSetupTask
The Data Pool Setup Task object.Data Pool Setup Tasks are executed when setting up your Data Pool. They ensure Propel will be able to sync records from your Data Source to your Data Pool.The exact Setup Tasks to perform vary by Data Source. For example, Data Pools pointing to a Snowflake-backed Data Sources will have their own specific Setup Tasks.
The syncing interval.Note that the syncing interval is approximate. For example, setting the syncing interval to EVERY_1_HOUR
does not mean that syncing will occur exactly on the hour. Instead, the syncing interval starts relative to
when the Data Pool goes LIVE, and Propel will attempt to sync approximately every hour. Additionally,
if you pause or resume syncing, this too can shift the syncing interval around.See DataPoolSyncInterval
Whether the Data Pool has access control enabled or not.If the Data Pool has access control enabled, Applications must be assigned Data Pool Access
Policies in order to query the Data Pool and its Metrics.
Validates a custom expression against the Data Pool’s available columns. If the provided expression is invalid, the ValidateExpressionResult response will contain a reason explaining why.
Response returned by the validateExpression query for validating expressions in Custom Metrics.Returns whether the expression is valid or not with a reason explaining why.
The Data Pool’s unique ID column. Propel uses the primary timestamp and a unique ID to compose a primary key for determining whether records should be inserted, deleted, or updated within the Data Pool.deprecated: Will be removed; use table settings to define the primary key.
Show UniqueId
A Data Pool’s unique ID column. Propel uses the primary timestamp and a unique ID to compose a primary key for determining whether records should be inserted, deleted, or updated within the Data Pool.
The Environments object.Environments are independent and isolated Propel workspaces for development, staging (testing), and production workloads. Environments are hosted in a specific region, initially in us-east-2 only.
Whether the Data Pool has access control enabled or not.If the Data Pool has access control enabled, Applications must be assigned Data Pool Access
Policies in order to query the Data Pool and its Metrics.
Validates a custom expression against the Data Pool’s available columns. If the provided expression is invalid, the ValidateExpressionResult response will contain a reason explaining why.
The Data Pool’s unique ID column. Propel uses the primary timestamp and a unique ID to compose a primary key for determining whether records should be inserted, deleted, or updated within the Data Pool.deprecated: Will be removed; use table settings to define the primary key.See UniqueId
Whether the Data Pool has access control enabled or not.If the Data Pool has access control enabled, Applications must be assigned Data Pool Access
Policies in order to query the Data Pool and its Metrics.
Validates a custom expression against the Data Pool’s available columns. If the provided expression is invalid, the ValidateExpressionResult response will contain a reason explaining why.
The Data Pool’s unique ID column. Propel uses the primary timestamp and a unique ID to compose a primary key for determining whether records should be inserted, deleted, or updated within the Data Pool.deprecated: Will be removed; use table settings to define the primary key.See UniqueId
Whether the Data Pool has access control enabled or not.If the Data Pool has access control enabled, Applications must be assigned Data Pool Access
Policies in order to query the Data Pool and its Metrics.
Validates a custom expression against the Data Pool’s available columns. If the provided expression is invalid, the ValidateExpressionResult response will contain a reason explaining why.
The Data Pool’s unique ID column. Propel uses the primary timestamp and a unique ID to compose a primary key for determining whether records should be inserted, deleted, or updated within the Data Pool.deprecated: Will be removed; use table settings to define the primary key.See UniqueId
A Data Pool’s primary timestamp column. Propel uses the primary timestamp to order and partition your data in Data Pools. It will serve as the time dimension for your Metrics.
A Data Pool’s unique ID column. Propel uses the primary timestamp and a unique ID to compose a primary key for determining whether records should be inserted, deleted, or updated within the Data Pool.
The Environments object.Environments are independent and isolated Propel workspaces for development, staging (testing), and production workloads. Environments are hosted in a specific region, initially in us-east-2 only.
A Data Pool’s primary timestamp column. Propel uses the primary timestamp to order and partition your data in Data Pools. It will serve as the time dimension for your Metrics.
A list of setup tasks performed on the Data Pool during its most recent setup attempt.
Show DataPoolSetupTask
The Data Pool Setup Task object.Data Pool Setup Tasks are executed when setting up your Data Pool. They ensure Propel will be able to sync records from your Data Source to your Data Pool.The exact Setup Tasks to perform vary by Data Source. For example, Data Pools pointing to a Snowflake-backed Data Sources will have their own specific Setup Tasks.
The syncing interval.Note that the syncing interval is approximate. For example, setting the syncing interval to EVERY_1_HOUR
does not mean that syncing will occur exactly on the hour. Instead, the syncing interval starts relative to
when the Data Pool goes LIVE, and Propel will attempt to sync approximately every hour. Additionally,
if you pause or resume syncing, this too can shift the syncing interval around.See DataPoolSyncInterval
Whether the Data Pool has access control enabled or not.If the Data Pool has access control enabled, Applications must be assigned Data Pool Access
Policies in order to query the Data Pool and its Metrics.
Validates a custom expression against the Data Pool’s available columns. If the provided expression is invalid, the ValidateExpressionResult response will contain a reason explaining why.
Response returned by the validateExpression query for validating expressions in Custom Metrics.Returns whether the expression is valid or not with a reason explaining why.
The Data Pool’s unique ID column. Propel uses the primary timestamp and a unique ID to compose a primary key for determining whether records should be inserted, deleted, or updated within the Data Pool.deprecated: Will be removed; use table settings to define the primary key.
Show UniqueId
A Data Pool’s unique ID column. Propel uses the primary timestamp and a unique ID to compose a primary key for determining whether records should be inserted, deleted, or updated within the Data Pool.
The syncing interval.Note that the syncing interval is approximate. For example, setting the syncing interval to EVERY_1_HOUR
does not mean that syncing will occur exactly on the hour. Instead, the syncing interval starts relative to
when the Data Pool goes LIVE, and Propel will attempt to sync approximately every hour. Additionally,
if you pause or resume syncing, this too can shift the syncing interval around.
Show DataPoolSyncInterval
The available Data Pool sync intervals. Specify unit of time between attempts to sync data from your data warehouse.Note that the syncing interval is approximate. For example, setting the syncing interval to EVERY_1_HOUR does not mean that syncing will occur exactly on the hour. Instead, the syncing interval starts relative to when the Data Pool goes LIVE, and Propel will attempt to sync approximately every hour. Additionally, if you pause or resume syncing, this too can shift the syncing interval around.
The Environments object. Environments are independent and isolated Propel workspaces for development, staging (testing), and production workloads. Environments are hosted in a specific region, initially in us-east-2 only.
The Data Source object. A Data Source is a connection to your data warehouse. It has the necessary connection details for Propel to access Snowflake or any other supported Data Source.
The Environments object. Environments are independent and isolated Propel workspaces for development, staging (testing), and production workloads. Environments are hosted in a specific region, initially in us-east-2 only.
A list of checks performed on the Data Source during its most recent connection attempt.
The Data Source Check object. Data Source Checks are executed when setting up your Data Source. They check that Propel will be able to receive data and set up Data Pools. The exact Checks performed vary by Data Source. For example, Snowflake-backed Data Sources have their own specific Checks.
If you list Data Pools via the dataPools field on a Data Source, you will get the Data Pools for that Data Source. The dataPools field uses cursor-based pagination typical of GraphQL APIs. You can use the pairs of parameters first and after or last and before to page forward or backward through the results, respectively. For forward pagination, the first parameter defines the number of results to return, and the after parameter defines the cursor to continue from; pass the cursor of the last result on the current page to after. For backward pagination, the last parameter defines the number of results to return, and the before parameter defines the cursor to continue from; pass the cursor of the first result on the current page to before.
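The paging rules above can be sketched as follows (function and variable names are illustrative, not part of the API):

```python
def forward_page(page_size: int, prev_end_cursor=None) -> dict:
    """Arguments for forward pagination: `first` limits the page size and
    `after` continues from the cursor of the last result on the previous page."""
    args = {"first": page_size}
    if prev_end_cursor is not None:
        args["after"] = prev_end_cursor
    return args

def backward_page(page_size: int, prev_start_cursor=None) -> dict:
    """Arguments for backward pagination: `last` limits the page size and
    `before` continues from the cursor of the first result on the previous page."""
    args = {"last": page_size}
    if prev_start_cursor is not None:
        args["before"] = prev_start_cursor
    return args
```

The first request omits the cursor; each subsequent request passes the cursor taken from the previous page of results.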
The number of new records contained within the Sync, if known. This excludes filtered records. Deprecated: all records are considered to be processed; see processedRecords instead.
The number of updated records contained within the Sync, if known. This excludes filtered records. Deprecated: all records are considered to be processed; see processedRecords instead.
The number of deleted records contained within the Sync, if known. This excludes filtered records. Deprecated: all records are considered to be processed; see processedRecords instead.
The number of records filtered out of the Sync, if known, due to issues such as a missing timestamp Dimension. Deprecated: all records are considered to be processed; see processedRecords instead.
The Metric Report connection object. It includes headers and rows for a single page of a report, and allows paging forward and backward to other pages of the report.
An ordered array of columns. Each column contains the dimension and Metric values for a single row, as defined in the report input. Use this to display a single row within your table.
A Propeller determines your Application’s query processing power. The larger the Propeller, the faster the queries and the higher the cost. Every Propel Application (and therefore every set of API credentials) has a Propeller that determines the speed and cost of queries.
An ordered array of columns. Each column contains the dimension and Metric values for a single row, as defined in the report input. Use this to display a single row within your table.
The Environments object. Environments are independent and isolated Propel workspaces for development, staging (testing), and production workloads. Environments are hosted in a specific region, initially in us-east-2 only.
An array of unique values for the Dimension, up to 1,000. Empty if the Dimension contains more than 1,000 unique values. Fetching unique values incurs query costs.
COUNT: Counts the number of records that match the Metric Filters. For time series, it will count the values for each time granularity.
SUM: Sums the values of the specified column for every record that matches the Metric Filters. For time series, it will sum the values for each time granularity.
COUNT_DISTINCT: Counts the number of distinct values in the specified column for every record that matches the Metric Filters. For time series, it will count the distinct values for each time granularity.
AVERAGE: Averages the values of the specified column for every record that matches the Metric Filters. For time series, it will average the values for each time granularity.
MIN: Selects the minimum value of the specified column for every record that matches the Metric Filters. For time series, it will select the minimum value for each time granularity.
MAX: Selects the maximum value of the specified column for every record that matches the Metric Filters. For time series, it will select the maximum value for each time granularity.
CUSTOM: Aggregates values based on the provided custom expression.
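The semantics of these aggregation types can be sketched over plain records. This is a simplified illustration only; Propel computes these aggregations in its query engine, and CUSTOM is omitted because it depends on a user-provided expression.

```python
def aggregate(agg_type, records, col=None):
    """Illustrative aggregation over records that already passed the
    Metric Filters. `col` is the measure column (unused for COUNT)."""
    values = [r[col] for r in records] if col else None
    if agg_type == "COUNT":
        return len(records)
    if agg_type == "SUM":
        return sum(values)
    if agg_type == "COUNT_DISTINCT":
        return len(set(values))
    if agg_type == "AVERAGE":
        return sum(values) / len(values)
    if agg_type == "MIN":
        return min(values)
    if agg_type == "MAX":
        return max(values)
    raise ValueError(f"unsupported aggregation: {agg_type}")

orders = [{"amount": 10}, {"amount": 20}, {"amount": 10}]
```

For a time series query, the same aggregation is applied separately to the records falling in each time granularity bucket.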
Metric Filters allow defining a Metric with a subset of records from the given Data Pool. If no Metric Filters are present, all records will be included. To filter at query time, add Dimensions and use the filters property on the timeSeriesInput, counterInput, or leaderboardInput objects; there is no need to add Metric Filters to be able to filter at query time. You can provide the filters in the form of SQL.
Metric Filters allow defining a Metric with a subset of records from the given Data Pool. If no Metric Filters are present, all records will be included. To filter at query time, add Dimensions and use the filters property on the timeSeriesInput, counterInput, or leaderboardInput objects; there is no need to add Metric Filters to be able to filter at query time. Deprecated: use filterSql instead. See Filter.
An array of unique values for the Dimension, up to 1,000. Empty if the Dimension contains more than 1,000 unique values. Fetching unique values incurs query costs.
Query the Metric in counter format. Returns the Metric’s value for the given time range and filters. Deprecated: use the top-level counter query instead.
The fields required to specify the time range for a time series, counter, or leaderboard Metric query. If no relative or absolute time ranges are provided, Propel defaults to an absolute time range beginning with the earliest record in the Metric’s Data Pool and ending with the latest record. If both relative and absolute time ranges are provided, the relative time range will take precedence. If a LAST_N relative time period is selected, an n ≥ 1 must be provided; if n is omitted or n < 1, a BAD_REQUEST error is returned.
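The precedence and validation rules above can be sketched as follows (names and return shapes are illustrative, not the API's):

```python
def resolve_time_range(relative=None, n=None, start=None, stop=None):
    """Pick the effective time range: relative wins over absolute, and
    LAST_N periods require n >= 1 (otherwise BAD_REQUEST)."""
    if relative is not None:
        if relative.startswith("LAST_N") and (n is None or n < 1):
            raise ValueError("BAD_REQUEST: LAST_N time ranges require n >= 1")
        return {"relative": relative, "n": n}
    if start is not None or stop is not None:
        return {"start": start, "stop": stop}
    # Default: from the earliest to the latest record in the Data Pool
    return {"start": "earliest record", "stop": "latest record"}
```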
The timestamp field to use when querying. Defaults to the timestamp configured on the Data Pool or Metric, if any.
Set this to filter on an alternative timestamp field.
The time zone to use. Dates and times are always returned in UTC, but setting the time zone influences relative time ranges and granularities. You can set this to “America/Los_Angeles”, “Europe/Berlin”, or any other value in the IANA time zone database. Defaults to “UTC”.
The Query Filters to apply before retrieving the counter data. If no Query Filters are provided, all data is included. Deprecated: use filterSql instead.
The fields of a filter. You can construct more complex filters using and and or. For example, to construct a filter equivalent to:
(value > 0 AND value <= 100) OR status = "confirmed"
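That expression could be expressed as a nested filter tree along these lines. This is a sketch only: the operator names and exact field shapes are assumptions, not confirmed API values; the structure reflects the rule that AND binds tighter than OR.

```python
# (value > 0 AND value <= 100) OR status = "confirmed"
filter_input = {
    "column": "value",
    "operator": "GREATER_THAN",  # assumed operator name
    "value": "0",
    "and": [{                     # ANDed with the filter above
        "column": "value",
        "operator": "LESS_THAN_OR_EQUAL_TO",  # assumed operator name
        "value": "100",
    }],
    "or": [{                      # ORed with the (value > 0 AND value <= 100) group
        "column": "status",
        "operator": "EQUALS",     # assumed operator name
        "value": "confirmed",
    }],
}
```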
Query the Metric in time series format. Returns arrays of timestamps and the Metric’s values for the given time range and filters. Deprecated: use the top-level timeSeries query instead.
The fields for querying a Metric in time series format. A Metric’s time series query returns the Metric’s values over a given time range, aggregated by a given time granularity, such as day, month, or year.
The fields required to specify the time range for a time series, counter, or leaderboard Metric query. If no relative or absolute time ranges are provided, Propel defaults to an absolute time range beginning with the earliest record in the Metric’s Data Pool and ending with the latest record. If both relative and absolute time ranges are provided, the relative time range will take precedence. If a LAST_N relative time period is selected, an n ≥ 1 must be provided; if n is omitted or n < 1, a BAD_REQUEST error is returned.
The timestamp field to use when querying. Defaults to the timestamp configured on the Data Pool or Metric, if any.
Set this to filter on an alternative timestamp field.
The time zone to use. Dates and times are always returned in UTC, but setting the time zone influences relative time ranges and granularities. You can set this to “America/Los_Angeles”, “Europe/Berlin”, or any other value in the IANA time zone database. Defaults to “UTC”.
The time granularity (hour, day, month, etc.) to aggregate the Metric values by.
The available time series granularities. Granularities define the unit of time used to aggregate the Metric data for a time series query. For example, if the granularity is set to DAY, the time series query will return a label and a value for each day. If there are no records for a given time series granularity, Propel will return the label and a value of “0” so that the time series can be properly visualized.
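The zero-filling behavior described above can be sketched as follows, using day granularity (the label format is illustrative):

```python
from datetime import date, timedelta

def day_series(start: date, stop: date, counts: dict):
    """Return (labels, values) for each day in [start, stop]. Days with
    no records get a value of "0" so the series can be visualized."""
    labels, values = [], []
    d = start
    while d <= stop:
        labels.append(d.isoformat())
        values.append(str(counts.get(d, 0)))  # zero-fill missing days
        d += timedelta(days=1)
    return labels, values
```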
The Query Filters to apply before retrieving the time series data. If no Query Filters are provided, all data is included. Deprecated: use filterSql instead.
The time series values for each group in groupBy, if specified.
The time series response object for a group specified in groupBy. It contains an array of time series labels and an array of Metric values for a particular group.
Query the Metric in leaderboard format. Returns a table (array of rows) with the selected dimensions and the Metric’s corresponding values for the given time range and filters. Deprecated: use the top-level leaderboard query instead.
The fields for querying a Metric in leaderboard format. A Metric’s leaderboard query returns an ordered table of Dimension and Metric values over a given time range.
The fields required to specify the time range for a time series, counter, or leaderboard Metric query. If no relative or absolute time ranges are provided, Propel defaults to an absolute time range beginning with the earliest record in the Metric’s Data Pool and ending with the latest record. If both relative and absolute time ranges are provided, the relative time range will take precedence. If a LAST_N relative time period is selected, an n ≥ 1 must be provided; if n is omitted or n < 1, a BAD_REQUEST error is returned.
The timestamp field to use when querying. Defaults to the timestamp configured on the Data Pool or Metric, if any.
Set this to filter on an alternative timestamp field.
The time zone to use. Dates and times are always returned in UTC, but setting the time zone influences relative time ranges and granularities. You can set this to “America/Los_Angeles”, “Europe/Berlin”, or any other value in the IANA time zone database. Defaults to “UTC”.
The Query Filters to apply before retrieving the leaderboard data. If no Query Filters are provided, all data is included. Deprecated: use filterSql instead.
Additional filters to OR with this one. AND takes precedence over OR.
The leaderboard response object. It contains an array of headers and a table (array of rows) with the selected Dimensions and corresponding Metric values for the given time range and Query Filters.
An ordered array of rows. Each row contains the Dimension values and the corresponding Metric value. A Dimension value can be empty. A Metric value will never be empty.
The Environments object. Environments are independent and isolated Propel workspaces for development, staging (testing), and production workloads. Environments are hosted in a specific region, initially in us-east-2 only.
The statistics for the dimension values. Fetching statistics incurs query costs. Deprecated: issue normal queries for calculating stats. See DimensionStatistics.
Query the Metric in counter format. Returns the Metric’s value for the given time range and filters. Deprecated: use the top-level counter query instead.
The time zone to use. Dates and times are always returned in UTC, but setting the time zone influences relative time ranges and granularities. You can set this to “America/Los_Angeles”, “Europe/Berlin”, or any other value in the IANA time zone database. Defaults to “UTC”.
The Query Filters to apply before retrieving the counter data. If no Query Filters are provided, all data is included. Deprecated: use filterSql instead. See FilterInput.
The counter response object. It contains a single Metric value for the given time range and Query Filters.
Query the Metric in time series format. Returns arrays of timestamps and the Metric’s values for the given time range and filters. Deprecated: use the top-level timeSeries query instead.
The fields for querying a Metric in time series format. A Metric’s time series query returns the Metric’s values over a given time range, aggregated by a given time granularity, such as day, month, or year.
The time zone to use. Dates and times are always returned in UTC, but setting the time zone influences relative time ranges and granularities. You can set this to “America/Los_Angeles”, “Europe/Berlin”, or any other value in the IANA time zone database. Defaults to “UTC”.
The Query Filters to apply before retrieving the time series data. If no Query Filters are provided, all data is included. Deprecated: use filterSql instead. See FilterInput.
The time series response object. It contains an array of time series labels and an array of Metric values for the given time range and Query Filters.
Query the Metric in leaderboard format. Returns a table (array of rows) with the selected dimensions and the Metric’s corresponding values for the given time range and filters. Deprecated: use the top-level leaderboard query instead.
The fields for querying a Metric in leaderboard format. A Metric’s leaderboard query returns an ordered table of Dimension and Metric values over a given time range.
The time zone to use. Dates and times are always returned in UTC, but setting the time zone influences relative time ranges and granularities. You can set this to “America/Los_Angeles”, “Europe/Berlin”, or any other value in the IANA time zone database. Defaults to “UTC”.
The Query Filters to apply before retrieving the leaderboard data. If no Query Filters are provided, all data is included. Deprecated: use filterSql instead. See FilterInput.
The leaderboard response object. It contains an array of headers and a table (array of rows) with the selected Dimensions and corresponding Metric values for the given time range and Query Filters.
An ordered array of rows. Each row contains the Dimension values and the corresponding Metric value. A Dimension value can be empty. A Metric value will never be empty.
Response returned by the validateExpression query for validating expressions in Custom Metrics. Returns whether the expression is valid, with a reason explaining why if it is not.
The time series values for each group in groupBy, if specified.
The time series response object for a group specified in groupBy. It contains an array of time series labels and an array of Metric values for a particular group.
A Propeller determines your Application’s query processing power. The larger the Propeller, the faster the queries and the higher the cost. Every Propel Application (and therefore every set of API credentials) has a Propeller that determines the speed and cost of queries.
The time series response object for a group specified in groupBy. It contains an array of time series labels and an array of Metric values for a particular group.
A Propeller determines your Application’s query processing power. The larger the Propeller, the faster the queries and the higher the cost. Every Propel Application (and therefore every set of API credentials) has a Propeller that determines the speed and cost of queries.
The leaderboard response object. It contains an array of headers and a table (array of rows) with the selected Dimensions and corresponding Metric values for the given time range and Query Filters.
An ordered array of rows. Each row contains the Dimension values and the corresponding Metric value. A Dimension value can be empty. A Metric value will never be empty.
A Propeller determines your Application’s query processing power. The larger the Propeller, the faster the queries and the higher the cost. Every Propel Application (and therefore every set of API credentials) has a Propeller that determines the speed and cost of queries.
Boosters allow you to optimize Metric Queries for a subset of commonly used Dimensions. A Metric can have one or many Boosters to optimize for different Query patterns. Boosters can be understood as an aggregating index. The index is formed from left to right as follows:
An array of unique values for the Dimension, up to 1,000. Empty if the Dimension contains more than 1,000 unique values. Fetching unique values incurs query costs.
The statistics for the dimension values. Fetching statistics incurs query costs. Deprecated: issue normal queries for calculating stats. See DimensionStatistics.
Deletion Job scheduled for a specific Data Pool. The Deletion Job represents the asynchronous process of deleting data, given some filters, inside a Data Pool. It tracks the deletion process until it is finished, showing the progress and the final outcome.
AddColumnToDataPoolJob scheduled for a specific Data Pool. The Add Column Job represents the asynchronous process of adding a column, given its name and type, to a Data Pool. It tracks the process until it is finished, showing the progress and the final outcome.
Environment to which the AddColumnToDataPoolJob belongs.
Show Environment
The AddColumnToDataPool Job that was just created.
Show AddColumnToDataPoolJob
UpdateDataPoolRecords Job scheduled for a specific Data Pool. The Update Data Pool Records Job represents the asynchronous process of updating records, given some filters, inside a Data Pool. It tracks the process until it is finished, showing the progress and the final outcome.
Environment to which the UpdateDataPoolRecords Job belongs.
Show Environment
The UpdateDataPoolRecords Job that was just created.
Show UpdateDataPoolRecordsJob
Modifies a Data Pool Access Policy with the provided unique name, description, columns and rows. If any of the optional arguments are omitted, those properties will be unchanged on the Data Pool Access Policy.
Row-level filters that the Access Policy applies before executing queries. If not provided, this property will not be modified. Deprecated: use filterSql instead.
Show FilterInput
The fields of a filter. You can construct more complex filters using and and or. For example, to construct a filter equivalent to
(value > 0 AND value <= 100) OR status = "confirmed"
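As a sketch, the same condition could be expressed as a nested FilterInput-style structure. The exact field names and operator spellings below are assumptions for illustration, not the verified schema:

```python
# Hypothetical nested filter equivalent to:
#   (value > 0 AND value <= 100) OR status = "confirmed"
# Field names and operator values are assumptions, not the verified schema.
filter_input = {
    "column": "value",
    "operator": "GREATER_THAN",
    "value": "0",
    # "and" filters combine with the base condition...
    "and": [
        {"column": "value", "operator": "LESS_THAN_OR_EQUAL_TO", "value": "100"},
    ],
    # ...and "or" filters union with that combined result.
    "or": [
        {"column": "status", "operator": "EQUALS", "value": "confirmed"},
    ],
}
```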
Assign a Data Pool Access Policy to an Application. The Data Pool Access Policy will restrict the Data Pool rows and columns that the Application can query. If the Data Pool has accessControlEnabled set to true, the Application must have a Data Pool Access Policy assigned in order to query the Data Pool. An Application can have at most one Data Pool Access Policy assigned for a given Data Pool. If an Application already has a Data Pool Access Policy for a given Data Pool and you call this mutation with another Data Pool Access Policy for the same Data Pool, the Application’s existing Data Pool Access Policy will be replaced.
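The replacement rule can be modeled as a map keyed by (Application, Data Pool). This is a sketch of the semantics only, not Propel’s implementation:

```python
# Sketch: an Application has at most one Access Policy per Data Pool;
# assigning another policy for the same Data Pool replaces the previous one.
assignments: dict[tuple[str, str], str] = {}

def assign_policy(application_id: str, data_pool_id: str, policy_id: str) -> None:
    # Overwrites any existing assignment for this (Application, Data Pool) pair.
    assignments[(application_id, data_pool_id)] = policy_id

def unassign_policy(application_id: str, data_pool_id: str) -> None:
    # After this, access is governed solely by accessControlEnabled.
    assignments.pop((application_id, data_pool_id), None)

assign_policy("app1", "pool1", "policyA")
assign_policy("app1", "pool1", "policyB")  # replaces policyA
```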
Unassign a Data Pool Access Policy from an Application. Once unassigned, whether the Application can query the Data Pool is controlled by the Data Pool’s accessControlEnabled property. If accessControlEnabled is true, the Application will no longer be able to query the Data Pool. If accessControlEnabled is false, the Application will be able to query all data in the Data Pool, unrestricted.
The Application’s Propeller. If no Propeller is provided, Propel will set the Propeller to P1_X_SMALL.
Show Propeller
The Application’s API authorization scopes. If specified, at least one scope must be provided; otherwise, all scopes will be granted to the Application by default.
Show ApplicationScope
The API operations an Application is authorized to perform.
ADMIN: Grant read/write access to Data Sources, Data Pools, Metrics and Policies.
APPLICATION_ADMIN: Grant read/write access to Applications.
DATA_POOL_QUERY: Grant read access to query Data Pools.
DATA_POOL_READ: Grant read access to read Data Pools.
DATA_POOL_STATS: Grant read access to fetch column statistics from Data Pools.
ENVIRONMENT_ADMIN: Grant read/write access to Environments.
METRIC_QUERY: Grant read access to query Metrics.
METRIC_STATS: Grant read access to fetch Dimension statistics from Metrics.
METRIC_READ: Grant read access to Metrics.
This does not allow querying Metrics. For that, see METRIC_QUERY.
Show ApplicationOrFailureResponse
The result of a mutation which creates or modifies an Application. If successful, an ApplicationResponse will be returned; otherwise, a FailureResponse will be returned.
Modifies an Application with the provided unique name, description, Propeller, and scopes. If any of the optional arguments are omitted, those properties will be unchanged on the Application.
Creates a new Data Source from the given Snowflake database using the specified Snowflake account, warehouse, schema, username, and role. Returns the newly created Data Source (or an error message if creating the Data Source fails).
The Snowflake account. Only include the part of your Snowflake URL before “snowflakecomputing.com” (make sure you are in the classic console, not Snowsight). For AWS-based accounts, this looks like “znXXXXX.us-east-2.aws”. For Google Cloud-based accounts, this looks like “ffXXXXX.us-central1.gcp”.
The Snowflake role. It should be “PROPELLER” if you used the default name in the setup script.
Show DataSourceOrFailureResponse
The result of a mutation which creates or modifies a DataSource. If successful, a DataSourceResponse will be returned; otherwise, a FailureResponse will be returned.
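For illustration, the account portion can be derived from a full Snowflake URL by stripping the “snowflakecomputing.com” suffix. This helper is a sketch, not part of the Propel API:

```python
def snowflake_account_from_url(url: str) -> str:
    """Extract the account identifier described above from a Snowflake URL.

    For example, "https://znXXXXX.us-east-2.aws.snowflakecomputing.com"
    yields "znXXXXX.us-east-2.aws".
    """
    # Keep only the host: drop the scheme and any path.
    host = url.split("://")[-1].split("/")[0]
    suffix = ".snowflakecomputing.com"
    if not host.endswith(suffix):
        raise ValueError(f"not a snowflakecomputing.com URL: {url}")
    return host[: -len(suffix)]
```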
Modifies a Data Source with the provided unique name, description, and connection settings. If any of the optional arguments are omitted, those properties will be unchanged on the Data Source.
The Snowflake account. Only include the part of your Snowflake URL before “snowflakecomputing.com” (make sure you are in the classic console, not Snowsight). For AWS-based accounts, this looks like “znXXXXX.us-east-2.aws”. For Google Cloud-based accounts, this looks like “ffXXXXX.us-central1.gcp”. If not provided, this property will not be modified.
The Snowflake warehouse name. It should be “PROPELLING” if you used the default name in the setup script. If not provided, this property will not be modified.
The Snowflake role. It should be “PROPELLER” if you used the default name in the setup script. If not provided, this property will not be modified.
Show DataSourceOrFailureResponse
The result of a mutation which creates or modifies a DataSource. If successful, a DataSourceResponse will be returned; otherwise, a FailureResponse will be returned.
The Data Source object. A Data Source is a connection to your data warehouse. It has the necessary connection details for Propel to access Snowflake or any other supported Data Source.
A list of checks performed on the Data Source during its most recent connection attempt.
Show DataSourceCheck
The Data Source Check object. Data Source Checks are executed when setting up your Data Source. They check that Propel will be able to receive data and set up Data Pools. The exact Checks to perform vary by Data Source. For example, Snowflake-backed Data Sources will have their own specific Checks.
If you list Data Pools via the dataPools field on a Data Source, you will get Data Pools for the Data Source. The dataPools field uses cursor-based pagination typical of GraphQL APIs. You can use the pairs of parameters first and after or last and before to page forward or backward through the results, respectively. For forward pagination, the first parameter defines the number of results to return, and the after parameter defines the cursor to continue from; you should pass the cursor for the last result of the current page to after. For backward pagination, the last parameter defines the number of results to return, and the before parameter defines the cursor to continue from; you should pass the cursor for the first result of the current page to before.
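The forward-pagination loop described above can be sketched as follows. Here fetch_page stands in for issuing the dataPools query, and the edges/pageInfo shape is the connection pattern typical of GraphQL APIs:

```python
def paginate_forward(fetch_page, page_size=10):
    """Collect all nodes by paging forward with first/after.

    fetch_page(first, after) is a stand-in for running the dataPools query;
    it must return a connection-style dict with "edges" (each carrying
    "cursor" and "node") and "pageInfo" with "hasNextPage".
    """
    nodes, after = [], None
    while True:
        page = fetch_page(first=page_size, after=after)
        edges = page["edges"]
        nodes.extend(edge["node"] for edge in edges)
        if not page["pageInfo"]["hasNextPage"] or not edges:
            break
        # Pass the cursor of the last result on this page to `after`.
        after = edges[-1]["cursor"]
    return nodes

# Stubbed three-item data set, fetched two per page, for demonstration.
DATA = [{"id": i} for i in range(3)]

def fetch_page(first, after):
    start = 0 if after is None else int(after) + 1
    chunk = DATA[start : start + first]
    return {
        "edges": [{"cursor": str(start + i), "node": n} for i, n in enumerate(chunk)],
        "pageInfo": {"hasNextPage": start + first < len(DATA)},
    }
```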
The table introspection object. When setting up a Data Source, Propel may need to introspect tables in order to determine what tables and columns are available to create Data Pools from. The table introspection represents the lifecycle of this operation (whether it’s in progress, succeeded, or failed) and the resulting tables and columns. These will be captured as table and column objects, respectively.
The table’s primary timestamp column. Propel uses the primary timestamp to order and partition your data in Data Pools. It’s part of what makes Propel fast for larger data sets. It will also serve as the time dimension for your Metrics. If you do not provide a primary timestamp column, you will need to supply an alternate timestamp when querying your Data Pool or its Metrics using the TimeRangeInput.
Show TimestampInput
The fields to specify the Data Pool’s primary timestamp column. Propel uses the primary timestamp to order and partition your data in Data Pools. It will serve as the time dimension for your Metrics.
Enables or disables access control for the Data Pool. If the Data Pool has access control enabled, Applications must be assigned Data Pool Access Policies in order to query the Data Pool and its Metrics.
Override the Data Pool’s table settings. These describe how the Data Pool’s table is created in ClickHouse, and a
default will be chosen based on the Data Pool’s timestamp and uniqueId values, if any. You can override these
defaults in order to specify a custom table engine, custom ORDER BY, etc.
Show TableSettingsInput
A Data Pool’s table settings. These describe how the Data Pool’s table is created in ClickHouse.
The ClickHouse table engine for the Data Pool’s table. This field is optional. A default will be chosen based on the Data Pool’s timestamp and uniqueId values, if specified. See TableEngineInput.
The PARTITION BY clause for the Data Pool’s table. This field is optional. A default will be chosen based on the Data Pool’s timestamp and uniqueId values, if specified.
The PRIMARY KEY clause for the Data Pool’s table. This field is optional. A default will be chosen based on the Data Pool’s timestamp and uniqueId values, if specified.
The ORDER BY clause for the Data Pool’s table. This field is optional. A default will be chosen based on the Data Pool’s timestamp and uniqueId values, if specified.
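Put together, a TableSettingsInput overriding all four defaults might look like the following sketch. The nested engine shape, clause values, and column names are hypothetical, not the verified schema:

```python
# Hypothetical TableSettingsInput; every value here is illustrative, and the
# nested engine shape is an assumption rather than the verified schema.
table_settings = {
    "engine": {"mergeTree": {"type": "MERGE_TREE"}},   # custom table engine
    "partitionBy": ["toYYYYMM(timestamp)"],            # custom PARTITION BY
    "primaryKey": ["timestamp", "account_id"],         # custom PRIMARY KEY
    "orderBy": ["timestamp", "account_id", "event_id"] # custom ORDER BY
}
```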
The Data Pool’s optional tenant ID column. The tenant ID column is used to control access to your data with access policies. Deprecated: will be removed; use Data Pool Access Policies instead.
Show TenantInput
The fields to specify the Data Pool’s tenant ID column. The tenant ID column is used to control access to your data with access policies.
The Data Pool’s unique ID column. Propel uses the primary timestamp and a unique ID to compose a primary key for determining whether records should be inserted, deleted, or updated within the Data Pool. Deprecated: will be removed; use table settings to define the primary key.
Show UniqueIdInput
The fields to specify the Data Pool’s unique ID column. Propel uses the primary timestamp and a unique ID to compose a primary key for determining whether records should be inserted, deleted, or updated within the Data Pool.
Whether the Data Pool has access control enabled or not. If the Data Pool has access control enabled, Applications must be assigned Data Pool Access Policies in order to query the Data Pool and its Metrics.
Validates a custom expression against the Data Pool’s available columns. If the provided expression is invalid, the ValidateExpressionResult response will contain a reason explaining why.
The Data Pool’s unique ID column. Propel uses the primary timestamp and a unique ID to compose a primary key for determining whether records should be inserted, deleted, or updated within the Data Pool. Deprecated: will be removed; use table settings to define the primary key. See UniqueId.
Modifies a Data Pool with the provided unique name, description, and data retention time. If any of the optional arguments are omitted, those properties will be unchanged on the Data Pool.
Show TimestampInput
The fields to specify the Data Pool’s primary timestamp column. Propel uses the primary timestamp to order and partition your data in Data Pools. It will serve as the time dimension for your Metrics.
Show DataPoolOrFailureResponse
The result of a mutation which creates or modifies a Data Pool. If successful, a DataPoolResponse will be returned; otherwise, a FailureResponse will be returned.
A Data Pool’s primary timestamp column. Propel uses the primary timestamp to order and partition your data in Data Pools. It will serve as the time dimension for your Metrics.
A list of setup tasks performed on the Data Pool during its most recent setup attempt.
Show DataPoolSetupTask
The Data Pool Setup Task object. Data Pool Setup Tasks are executed when setting up your Data Pool. They ensure Propel will be able to sync records from your Data Source to your Data Pool. The exact Setup Tasks to perform vary by Data Source. For example, Data Pools pointing to Snowflake-backed Data Sources will have their own specific Setup Tasks.
The syncing interval. Note that the syncing interval is approximate. For example, setting the syncing interval to EVERY_1_HOUR does not mean that syncing will occur exactly on the hour. Instead, the syncing interval starts relative to when the Data Pool goes LIVE, and Propel will attempt to sync approximately every hour. Additionally, pausing or resuming syncing can shift the syncing interval. See DataPoolSyncInterval.
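As an illustration of “relative to when the Data Pool goes LIVE”: approximate sync times are offsets from the LIVE timestamp, not wall-clock boundaries. This is a sketch of the idea, not Propel’s actual scheduler:

```python
from datetime import datetime, timedelta

def approximate_sync_times(live_at: datetime, interval: timedelta, count: int):
    """Sketch of EVERY_*-style scheduling: syncs are offsets from the moment
    the Data Pool went LIVE, not aligned to wall-clock boundaries."""
    return [live_at + interval * i for i in range(1, count + 1)]

live = datetime(2024, 1, 1, 10, 23)  # Data Pool went LIVE at 10:23
times = approximate_sync_times(live, timedelta(hours=1), 3)
# With EVERY_1_HOUR, syncs land near 11:23, 12:23, 13:23 -- not on the hour.
```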
Response returned by the validateExpression query for validating expressions in Custom Metrics. Returns whether the expression is valid or not, with a reason explaining why.
Show UniqueId
A Data Pool’s unique ID column. Propel uses the primary timestamp and a unique ID to compose a primary key for determining whether records should be inserted, deleted, or updated within the Data Pool.
The Metric’s Filters, in the form of SQL. Metric Filters allow defining a Metric with a subset of records from the given Data Pool. If no Filters are present, all records will be included.
The Metric’s Filters. Metric Filters allow defining a Metric with a subset of records from the given Data Pool. If no Filters are present, all records will be included. Deprecated: use filterSql instead.
Show FilterInput
Query the Metric in counter format. Returns the Metric’s value for the given time range and filters. Deprecated: use the top-level counter query instead.
Query the Metric in time series format. Returns arrays of timestamps and the Metric’s values for the given time range and filters. Deprecated: use the top-level timeSeries query instead.
Query the Metric in leaderboard format. Returns a table (array of rows) with the selected dimensions and the Metric’s corresponding values for the given time range and filters. Deprecated: use the top-level leaderboard query instead.
The Metric’s Filters, in the form of SQL. Metric Filters allow defining a Metric with a subset of records from the given Data Pool. If no Filters are present, all records will be included.
The Metric’s Filters. Metric Filters allow defining a Metric with a subset of records from the given Data Pool. If no Filters are present, all records will be included.deprecated: Use filterSql instead
Show FilterInput
The fields of a filter.You can construct more complex filters using and and or. For example, to construct a filter equivalent to
Copy
(value > 0 AND value <= 100) OR status = "confirmed"
Query the Metric in counter format. Returns the Metric’s value for the given time range and filters.deprecated: Use the top-level counter query instead
Query the Metric in time series format. Returns arrays of timestamps and the Metric’s values for the given time range and filters.deprecated: Use the top-level timeSeries query instead
Query the Metric in leaderboard format. Returns a table (array of rows) with the selected dimensions and the Metric’s corresponding values for the given time range and filters.deprecated: Use the top-level leaderboard query instead
The Metric’s Filters, in the form of SQL. Metric Filters allow defining a Metric with a subset of records from the given Data Pool. If no Filters are present, all records will be included.
The Metric’s Filters. Metric Filters allow defining a Metric with a subset of records from the given Data Pool. If no Filters are present, all records will be included.deprecated: Use filterSql instead
Show FilterInput
The fields of a filter.You can construct more complex filters using and and or. For example, to construct a filter equivalent to
Copy
(value > 0 AND value <= 100) OR status = "confirmed"
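As a sketch, assuming a FilterInput exposes column, operator, and value fields plus nestable and and or lists (with and binding more tightly than or), an equivalent filter could look like:

```graphql
{
  column: "value"
  operator: GREATER_THAN
  value: "0"
  # AND-ed with the base condition: value > 0 AND value <= 100
  and: [{ column: "value", operator: LESS_THAN_OR_EQUAL_TO, value: "100" }]
  # OR-ed with the combined result: ... OR status = "confirmed"
  or: [{ column: "status", operator: EQUALS, value: "confirmed" }]
}
```

The operator names shown here are assumptions; check them against the current FilterInput schema before relying on this shape.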
Modifies a Metric by ID with the provided unique name, description, and Dimensions. If any of the optional arguments are omitted, those properties will be unchanged on the Metric.
The statistics for the dimension values. Fetching statistics incurs query costs. Deprecated: issue normal queries for calculating stats instead. See DimensionStatistics.
COUNT: Counts the number of records that matches the Metric Filters. For time series, it will count the values for each time granularity.
SUM: Sums the values of the specified column for every record that matches the Metric Filters. For time series, it will sum the values for each time granularity.
COUNT_DISTINCT: Counts the number of distinct values in the specified column for every record that matches the Metric Filters. For time series, it will count the distinct values for each time granularity.
AVERAGE: Averages the values of the specified column for every record that matches the Metric Filters. For time series, it will average the values for each time granularity.
MIN: Selects the minimum value of the specified column for every record that matches the Metric Filters. For time series, it will select the minimum value for each time granularity.
MAX: Selects the maximum value of the specified column for every record that matches the Metric Filters. For time series, it will select the maximum value for each time granularity.
CUSTOM: Aggregates values based on the provided custom expression.
Query the Metric in counter format. Returns the Metric’s value for the given time range and filters. Deprecated: use the top-level counter query instead.
The time zone to use. Dates and times are always returned in UTC, but setting the time zone influences relative time ranges and granularities. You can set this to “America/Los_Angeles”, “Europe/Berlin”, or any other value in the IANA time zone database. Defaults to “UTC”.
The Query Filters to apply before retrieving the counter data. If no Query Filters are provided, all data is included. Deprecated: use filterSql instead. See FilterInput.
The counter response object. It contains a single Metric value for the given time range and Query Filters.
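Since the per-Metric counter field is deprecated, the top-level counter query can be sketched as follows; the Metric name and the metricName, timeRange, and relative input fields are assumptions to verify against the schema:

```graphql
query {
  counter(
    input: {
      metricName: "revenue"                       # hypothetical Metric
      timeRange: { relative: LAST_N_DAYS, n: 30 } # assumed relative-range encoding
      timeZone: "America/Los_Angeles"             # shifts relative ranges; output stays UTC
      filterSql: "status = 'confirmed'"
    }
  ) {
    value
  }
}
```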
Query the Metric in time series format. Returns arrays of timestamps and the Metric’s values for the given time range and filters. Deprecated: use the top-level timeSeries query instead.
The fields for querying a Metric in time series format. A Metric’s time series query returns the Metric’s values over a given time range, aggregated by a given time granularity (for example, day, month, or year).
The time zone to use. Dates and times are always returned in UTC, but setting the time zone influences relative time ranges and granularities. You can set this to “America/Los_Angeles”, “Europe/Berlin”, or any other value in the IANA time zone database. Defaults to “UTC”.
The Query Filters to apply before retrieving the time series data. If no Query Filters are provided, all data is included. Deprecated: use filterSql instead. See FilterInput.
The time series response object. It contains an array of time series labels and an array of Metric values for the given time range and Query Filters.
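The top-level timeSeries query can be sketched as follows; the Metric name and the metricName, granularity, and timeRange input fields are assumptions to verify against the schema:

```graphql
query {
  timeSeries(
    input: {
      metricName: "revenue"                       # hypothetical Metric
      granularity: DAY                            # one aggregated value per day
      timeRange: { relative: LAST_N_DAYS, n: 30 } # assumed relative-range encoding
    }
  ) {
    labels   # array of timestamps
    values   # array of Metric values, aligned with labels
  }
}
```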
Query the Metric in leaderboard format. Returns a table (an array of rows) with the selected Dimensions and the Metric’s corresponding values for the given time range and filters. Deprecated: use the top-level leaderboard query instead.
The fields for querying a Metric in leaderboard format. A Metric’s leaderboard query returns an ordered table of Dimension and Metric values over a given time range.
The time zone to use. Dates and times are always returned in UTC, but setting the time zone influences relative time ranges and granularities. You can set this to “America/Los_Angeles”, “Europe/Berlin”, or any other value in the IANA time zone database. Defaults to “UTC”.
The Query Filters to apply before retrieving the leaderboard data. If no Query Filters are provided, all data is included. Deprecated: use filterSql instead. See FilterInput.
The leaderboard response object. It contains an array of headers and a table (array of rows) with the selected Dimensions and corresponding Metric values for the given time range and Query Filters.
An ordered array of rows. Each row contains the Dimension values and the corresponding Metric value. A Dimension value can be empty. A Metric value will never be empty.
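The top-level leaderboard query can be sketched as follows; the Metric name, Dimension column, and input field names (metricName, dimensions, sort, rowLimit) are assumptions to verify against the schema:

```graphql
query {
  leaderboard(
    input: {
      metricName: "revenue"                       # hypothetical Metric
      timeRange: { relative: LAST_N_DAYS, n: 30 } # assumed relative-range encoding
      dimensions: [{ columnName: "country" }]     # hypothetical Dimension
      sort: DESC                                  # highest Metric values first
      rowLimit: 10
    }
  ) {
    headers   # the selected Dimension names plus the Metric
    rows      # ordered rows of Dimension values and the Metric value
  }
}
```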
Boosters allow you to optimize Metric Queries for a subset of commonly used Dimensions. A Metric can have one or many Boosters to optimize for different Query patterns. Boosters can be understood as an aggregating index; the index is formed from left to right.
Deletes a Booster by ID and then returns the same ID if the Booster was deleted successfully. A Booster significantly improves the query performance for a Metric.
A Deletion Job scheduled for a specific Data Pool. The Deletion Job represents the asynchronous process of deleting data that matches the given filters inside a Data Pool. It tracks the deletion until it finishes, reporting progress and the final outcome.
The AddColumnToDataPool Job that was just created.
An AddColumnToDataPoolJob scheduled for a specific Data Pool. The Add Column Job represents the asynchronous process of adding a column, given its name and type, to a Data Pool. It tracks the process until it finishes, reporting progress and the final outcome.
The UpdateDataPoolRecords Job that was just created.
An UpdateDataPoolRecords Job scheduled for a specific Data Pool. The Update Data Pool Records Job represents the asynchronous process of updating records that match the given filters inside a Data Pool. It tracks the process until it finishes, reporting progress and the final outcome.
The Data Source object. A Data Source is a connection to your data warehouse. It has the necessary connection details for Propel to access Snowflake or any other supported Data Source.
If you list Data Pools via the dataPools field on a Data Source, you will get Data Pools for the Data Source. The dataPools field uses cursor-based pagination, typical of GraphQL APIs. You can use the pairs of parameters first and after or last and before to page forward or backward through the results, respectively. For forward pagination, the first parameter defines the number of results to return, and the after parameter defines the cursor to continue from; pass the cursor of the last result on the current page to after. For backward pagination, the last parameter defines the number of results to return, and the before parameter defines the cursor to continue from; pass the cursor of the first result on the current page to before.
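A forward-pagination sketch over dataPools, assuming the standard GraphQL connection shape (edges, node, pageInfo); the Data Source ID and node fields are hypothetical:

```graphql
query {
  dataSource(id: "DSO00000000") {            # hypothetical Data Source ID
    dataPools(first: 10, after: "<endCursor from the previous page>") {
      edges {
        cursor
        node { id uniqueName }
      }
      pageInfo {
        hasNextPage
        endCursor   # pass this as `after` to fetch the next page
      }
    }
  }
}
```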
The number of new records contained within the Sync, if known. This excludes filtered records. Deprecated: all records are considered to be processed; see processedRecords instead.
The number of updated records contained within the Sync, if known. This excludes filtered records. Deprecated: all records are considered to be processed; see processedRecords instead.
The number of deleted records contained within the Sync, if known. This excludes filtered records. Deprecated: all records are considered to be processed; see processedRecords instead.
The number of filtered records contained within the Sync (due to issues such as a missing timestamp Dimension), if any are known to be invalid. Deprecated: all records are considered to be processed; see processedRecords instead.
By default, a destination Data Pool with default settings is created for the Materialized View; however, you can customize the destination Data Pool (or point to an existing one) by setting this field. Use it to target an existing Data Pool or to configure the engine settings of a new Data Pool.
The fields for targeting an existing Data Pool or a new Data Pool.
If specified, the Materialized View will target an existing Data Pool. Ensure the Data Pool’s schema is compatible with your Materialized View’s SQL statement. See DataPoolInput.
By default, a Materialized View only applies to records added after its creation. This option backfills all the data that was present before the Materialized View was created.
Deletes a Materialized View and returns its ID if the Materialized View was deleted successfully. Note that deleting a Materialized View does not delete its target Data Pool; to delete the target Data Pool, use the deleteDataPool mutation.
Creates a new Amazon Data Firehose Data Source from the given settings. Returns the newly created Data Source (or an error message if creating the Data Source fails).
Enables or disables access control for the Data Pool.
If the Data Pool has access control enabled, Applications must be assigned Data Pool Access Policies in order to query the Data Pool and its Metrics.
HTTP basic access authentication credentials. You must configure these same credentials to be included in the X-Amz-Firehose-Access-Key header when Amazon Data Firehose issues requests to its custom HTTP endpoint. See HttpBasicAuthInput.
Override the Data Pool’s table settings. These describe how the Data Pool’s table is created in ClickHouse; a default is chosen based on the Data Pool’s timestamp value, if any. You can override these defaults in order to specify a custom table engine, a custom ORDER BY, and so on. See TableSettingsInput.
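A TableSettingsInput override might be sketched as follows; the orderBy field corresponds to the custom ORDER BY mentioned above, while the engine encoding and column names shown are assumptions:

```graphql
tableSettings: {
  engine: { mergeTree: { type: MERGE_TREE } }   # assumed ClickHouse engine encoding
  orderBy: ["timestamp", "account_id"]          # hypothetical ORDER BY columns
}
```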
Modifies the Data Source identified by the given ID or unique name with the provided unique name, description, and connection settings. If any of the optional arguments are omitted, those properties will be unchanged on the Data Source.
HTTP basic access authentication credentials. You must configure these same credentials to be included in the X-Amz-Firehose-Access-Key header when Amazon Data Firehose issues requests to its custom HTTP endpoint. If not provided, this property will not be modified. See HttpBasicAuthInput.
Creates a new Amazon DynamoDB Data Source from the given settings. Returns the newly created Data Source (or an error message if creating the Data Source fails).
Enables or disables access control for the Data Pool.
If the Data Pool has access control enabled, Applications must be assigned Data Pool Access Policies in order to query the Data Pool and its Metrics.
HTTP basic access authentication credentials. You must configure these same credentials to be included in the X-Amz-Firehose-Access-Key header when Amazon Data Firehose transmits records from your DynamoDB table to its custom HTTP endpoint. See HttpBasicAuthInput.
Override the Data Pool’s table settings. These describe how the Data Pool’s table is created in ClickHouse; a default is chosen based on the Data Pool’s timestamp value, if any. You can override these defaults in order to specify a custom table engine, a custom ORDER BY, and so on. See TableSettingsInput.
Modifies the Data Source identified by the given ID or unique name with the provided unique name, description, and connection settings. If any of the optional arguments are omitted, those properties will be unchanged on the Data Source.
HTTP basic access authentication credentials. You must configure these same credentials to be included in the X-Amz-Firehose-Access-Key header when Amazon Data Firehose transmits records from your DynamoDB table to its custom HTTP endpoint. If not provided, this property will not be modified. See HttpBasicAuthInput.
This mutation creates a new ClickHouse Data Source. The mutation returns the newly created Data Source (or an error message if creating the Data Source fails).
This mutation selects a Data Source by its ID or unique name and modifies it to have the given unique name, description, and connection settings. If any of the optional arguments are omitted, those properties will be unchanged on the Data Source.
The HTTP Basic authentication settings for uploading new data. If this parameter is not provided, anyone with the URL to your tables will be able to upload data. While it’s OK to test without HTTP Basic authentication, we recommend enabling it. See HttpBasicAuthInput.
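HTTP Basic authentication is a username/password pair; a hedged sketch of the HttpBasicAuthInput, assuming its fields are named username and password:

```graphql
basicAuth: {
  username: "uploader"          # hypothetical credentials; clients must present
  password: "a-strong-secret"   # them via HTTP Basic auth when uploading data
}
```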
This mutation selects a Data Source by its ID or unique name and modifies it to have the given unique name, description, and connection settings. If any of the optional arguments are omitted, those properties will be unchanged on the Data Source.
The HTTP Basic authentication settings for uploading new data. If this parameter is not provided, anyone with the URL to your tables will be able to upload data. While it’s OK to test without HTTP Basic authentication, we recommend enabling it. If not provided, this property will not be modified. See HttpBasicAuthInput.
Set this to false to disable HTTP Basic authentication. Any previously stored HTTP Basic authentication settings will be cleared out. If not provided, this property will not be modified.
This mutation creates a new Kafka Data Source. The mutation returns the newly created Data Source (or an error message if creating the Data Source fails).
This mutation selects a Data Source by its ID or unique name and modifies it to have the given unique name, description, and connection settings. If any of the optional arguments are omitted, those properties will be unchanged on the Data Source.
Selects a Data Source by its ID or unique name and modifies it to have the given unique name, description, and connection settings. If any of the optional arguments are omitted, those properties will be unchanged on the Data Source.
Creates a new Amazon S3 Data Source pointed at the specified Amazon S3 bucket. Returns the newly created Data Source (or an error message if creating the Data Source fails).
The connection settings for an Amazon S3 Data Source. These include the Amazon S3 bucket name, the AWS access key ID, and the tables (along with their paths). We do not allow fetching the AWS secret access key after it has been set.
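A minimal sketch of the creation mutation under stated assumptions: the mutation name (createS3DataSource) and the connection-settings field names (bucket, awsAccessKeyId, awsSecretAccessKey, tables) are inferred from the description above, not confirmed signatures:

```graphql
# Hypothetical mutation sketch; all names below are assumptions.
mutation {
  createS3DataSource(input: {
    uniqueName: "my-s3-source"
    connectionSettings: {
      bucket: "my-bucket"
      awsAccessKeyId: "AKIAEXAMPLE"
      awsSecretAccessKey: "example-secret"  # cannot be fetched once set
      tables: [{ name: "events", path: "events/" }]
    }
  }) {
    id
    uniqueName
  }
}
```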
This mutation selects a Data Source by its ID or unique name and modifies it to have the given unique name, description, and connection settings. If any of the optional arguments are omitted, those properties will be unchanged on the Data Source.
The Amazon S3 Data Source’s new connection settings. If not provided, this property will not be modified.
Creates a new Twilio Segment Data Source from the given settings. Returns the newly created Data Source (or an error message if creating the Data Source fails).
Enables or disables access control for the Data Pool.
If the Data Pool has access control enabled, Applications must be assigned Data Pool Access Policies in order to query the Data Pool and its Metrics.
The HTTP basic authentication settings for the Twilio Segment Data Source URL. If this parameter is not provided, anyone with the webhook URL will be able to send events. While it’s OK to test without HTTP Basic authentication, we recommend enabling it. See HttpBasicAuthInput.
Override the Data Pool’s table settings. These describe how the Data Pool’s table is created in ClickHouse. A default will be chosen based on the Data Pool’s timestamp and uniqueId values, if any. You can override these defaults in order to specify a custom table engine, custom ORDER BY, etc. See TableSettingsInput.
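Putting the arguments above together, creating a Twilio Segment Data Source with HTTP Basic authentication might be sketched as follows; the mutation name and input field names are assumptions for illustration:

```graphql
# Hypothetical mutation sketch; names are assumptions.
mutation {
  createTwilioSegmentDataSource(input: {
    uniqueName: "my-segment-source"
    connectionSettings: {
      # Recommended: enable HTTP Basic auth so that only callers with
      # these credentials can send events to the webhook URL.
      basicAuth: { username: "segment", password: "example-password" }
    }
  }) {
    id
    uniqueName
  }
}
```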
Selects the Data Source by the provided ID or unique name and modifies it to have the given unique name, description, and connection settings. If any of the optional arguments are omitted, those properties will be unchanged on the Data Source.
The HTTP basic authentication settings for the Twilio Segment Data Source URL. If this parameter is not provided, anyone with the webhook URL will be able to send events. While it’s OK to test without HTTP Basic authentication, we recommend enabling it. If not provided, this property will not be modified. See HttpBasicAuthInput.
Set this to false to disable HTTP Basic authentication. Any previously stored HTTP Basic authentication settings will be cleared out. If not provided, this property will not be modified.
Creates a new Webhook Data Source from the given settings. Returns the newly created Data Source (or an error message if creating the Data Source fails).
The HTTP basic authentication settings for the Webhook Data Source URL. If this parameter is not provided, anyone with the webhook URL will be able to send events. While it’s OK to test without HTTP Basic authentication, we recommend enabling it. See HttpBasicAuthInput.
The unique ID column, if any. Propel uses the primary timestamp and a unique ID to compose a primary key for determining whether records should be inserted, deleted, or updated.
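The arguments above (HTTP Basic authentication, table settings, and the unique ID column) might combine into a creation call like this sketch; the mutation name and input field names are assumptions:

```graphql
# Hypothetical mutation sketch; names are assumptions.
mutation {
  createWebhookDataSource(input: {
    uniqueName: "my-webhook-source"
    connectionSettings: {
      basicAuth: { username: "propel", password: "example-password" }
      timestamp: "received_at"  # primary timestamp column
      uniqueId: "event_id"      # used with the timestamp as a primary key
    }
  }) {
    id
  }
}
```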
Selects the Data Source by the provided ID or unique name and modifies it to have the given unique name, description, and connection settings. If any of the optional arguments are omitted, those properties will be unchanged on the Data Source.
The HTTP basic authentication settings for the Webhook Data Source URL. If this parameter is not provided, anyone with the webhook URL will be able to send events. While it’s OK to test without HTTP Basic authentication, we recommend enabling it. If not provided, this property will not be modified. See HttpBasicAuthInput.
Set this to false to disable HTTP Basic authentication. Any previously stored HTTP Basic authentication settings will be cleared out. If not provided, this property will not be modified.
The Environments object. Environments are independent and isolated Propel workspaces for development, staging (testing), and production workloads. Environments are hosted in a specific region, initially in us-east-2 only.
A Data Pool’s primary timestamp column. Propel uses the primary timestamp to order and partition your data in Data Pools. It will serve as the time dimension for your Metrics.
A list of setup tasks performed on the Data Pool during its most recent setup attempt.
The Data Pool Setup Task object. Data Pool Setup Tasks are executed when setting up your Data Pool. They ensure Propel will be able to sync records from your Data Source to your Data Pool. The exact Setup Tasks to perform vary by Data Source. For example, Data Pools pointing to a Snowflake-backed Data Source will have their own specific Setup Tasks.
The syncing interval. Note that the syncing interval is approximate. For example, setting the syncing interval to EVERY_1_HOUR does not mean that syncing will occur exactly on the hour. Instead, the syncing interval starts relative to when the Data Pool goes LIVE, and Propel will attempt to sync approximately every hour. Additionally, pausing or resuming syncing can also shift the syncing interval. See DataPoolSyncInterval.
Whether the Data Pool has access control enabled or not. If the Data Pool has access control enabled, Applications must be assigned Data Pool Access Policies in order to query the Data Pool and its Metrics.
Validates a custom expression against the Data Pool’s available columns. If the provided expression is invalid, the ValidateExpressionResult response will contain a reason explaining why.
Response returned by the validateExpression query for validating expressions in Custom Metrics. Returns whether the expression is valid, along with a reason explaining why if it is not.
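A sketch of validating a custom expression against a Data Pool’s columns; the argument name (expression) and the fields selected on ValidateExpressionResult (valid, reason) are assumptions:

```graphql
# Hypothetical query sketch; names are assumptions.
query {
  dataPool(id: "DPOXXXXX") {
    validateExpression(expression: "SUM(price * quantity)") {
      valid
      reason  # populated with an explanation when the expression is invalid
    }
  }
}
```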
The Data Pool’s unique ID column. Propel uses the primary timestamp and a unique ID to compose a primary key for determining whether records should be inserted, deleted, or updated within the Data Pool. Deprecated: will be removed; use table settings to define the primary key.
A Data Pool’s unique ID column. Propel uses the primary timestamp and a unique ID to compose a primary key for determining whether records should be inserted, deleted, or updated within the Data Pool.
Deletion Job scheduled for a specific Data Pool. The Deletion Job represents the asynchronous process of deleting data given some filters inside a Data Pool. It tracks the deletion process until it is finished, showing the progress and the outcome when it is finished.
The data gathered by the SQL query. The data is returned in an N x M matrix format, where the first dimension is the rows retrieved and the second dimension is the columns. Each cell can be either a string or null; the string can represent a number, text, date, or boolean value.
A Propeller determines your Application’s query processing power. The larger the Propeller, the faster the queries and the higher the cost. Every Propel Application (and therefore every set of API credentials) has a Propeller that determines the speed and cost of queries.
The Data Grid connection. It includes headers and rows for a single page of a Data Grid table. It also allows paging forward and backward to other pages of the Data Grid table.
The Application object. Propel Applications represent the web or mobile app you are building. They provide the API credentials that allow your client- or server-side app to access the Propel API. The Application’s Propeller determines the speed and cost of your Metric Queries.
Returns the Applications within the Environment. The applications query uses cursor-based pagination typical of GraphQL APIs. You can use the pairs of parameters first and after or last and before to page forward or backward through the results, respectively. For forward pagination, the first parameter defines the number of results to return, and the after parameter defines the cursor to continue from; you should pass the cursor for the last result of the current page to after. For backward pagination, the last parameter defines the number of results to return, and the before parameter defines the cursor to continue from; you should pass the cursor for the first result of the current page to before.
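For example, a first page of Applications could be requested as follows; the selection set and Relay-style pageInfo fields are illustrative:

```graphql
# Fetch the first 5 Applications; pass pageInfo.endCursor as `after`
# on the next request to page forward.
query {
  applications(first: 5) {
    nodes {
      id
      uniqueName
    }
    pageInfo {
      hasNextPage
      endCursor
    }
  }
}
```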
Returns the Data Source specified by the given ID.
query {
  dataSource(id: "DSOXXXXX") {
    id
    uniqueName
    type
    tables(first: 100) {
      nodes {
        id
        name
        columns(first: 100) {
          nodes {
            name
            type
            isNullable
            supportedDataPoolColumnTypes
          }
        }
      }
    }
  }
}
A list of checks performed on the Data Source during its most recent connection attempt.
The Data Source Check object. Data Source Checks are executed when setting up your Data Source. They check that Propel will be able to receive data and set up Data Pools. The exact Checks to perform vary by Data Source. For example, Snowflake-backed Data Sources will have their own specific Checks.
Returns the Data Sources within the Environment. A Data Source is a connection to your data warehouse. It has the necessary connection details for Propel to access Snowflake or any other supported Data Source. Environments are independent and isolated Propel workspaces for development, staging (testing), and production workloads. The dataSources query uses cursor-based pagination typical of GraphQL APIs. You can use the pairs of parameters first and after or last and before to page forward or backward through the results, respectively. For forward pagination, the first parameter defines the number of results to return, and the after parameter defines the cursor to continue from; you should pass the cursor for the last result of the current page to after. For backward pagination, the last parameter defines the number of results to return, and the before parameter defines the cursor to continue from; you should pass the cursor for the first result of the current page to before.
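Backward pagination over Data Sources might look like this sketch; the cursor value and pageInfo fields are illustrative:

```graphql
# Fetch the last 10 Data Sources before a given cursor; pass the cursor
# of the first result on this page (pageInfo.startCursor) as `before`.
query {
  dataSources(last: 10, before: "cursor-of-first-result") {
    nodes {
      id
      uniqueName
      type
    }
    pageInfo {
      hasPreviousPage
      startCursor
    }
  }
}
```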
Returns the Data Pool specified by the given ID. A Data Pool is a cached table hydrated from your data warehouse optimized for high-concurrency and low-latency queries.
Returns the Data Pool specified by the given unique name. A Data Pool is a cached table hydrated from your data warehouse optimized for high-concurrency and low-latency queries.
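A sketch of fetching a Data Pool by unique name; the query name (dataPoolByUniqueName) and the selected fields are assumptions, not confirmed API names:

```graphql
# Hypothetical query sketch; the query and field names are assumptions.
query {
  dataPoolByUniqueName(uniqueName: "my-data-pool") {
    id
    uniqueName
    syncInterval
  }
}
```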
Returns the Data Pools within the Environment. A Data Pool is a cached table hydrated from your data warehouse optimized for high-concurrency and low-latency queries. Environments are independent and isolated Propel workspaces for development, staging (testing), and production workloads. The dataPools query uses cursor-based pagination typical of GraphQL APIs. You can use the pairs of parameters first and after or last and before to page forward or backward through the results, respectively. For forward pagination, the first parameter defines the number of results to return, and the after parameter defines the cursor to continue from; you should pass the cursor for the last result of the current page to after. For backward pagination, the last parameter defines the number of results to return, and the before parameter defines the cursor to continue from; you should pass the cursor for the first result of the current page to before.
Returns the Materialized Views within the Environment. The materializedViews query uses cursor-based pagination typical of GraphQL APIs. You can use the pairs of parameters first and after or last and before to page forward or backward through the results, respectively. For forward pagination, the first parameter defines the number of results to return, and the after parameter defines the cursor to continue from; you should pass the cursor for the last result of the current page to after. For backward pagination, the last parameter defines the number of results to return, and the before parameter defines the cursor to continue from; you should pass the cursor for the first result of the current page to before.
Returns the Data Pool Access Policy specified by the given ID. A Data Pool Access Policy limits the data that Applications can access within a Data Pool.
The statistics for the dimension values. Fetching statistics incurs query costs. Deprecated: issue normal queries for calculating stats. See DimensionStatistics.
COUNT: Counts the number of records that match the Metric Filters. For time series, it will count the values for each time granularity.
SUM: Sums the values of the specified column for every record that matches the Metric Filters. For time series, it will sum the values for each time granularity.
COUNT_DISTINCT: Counts the number of distinct values in the specified column for every record that matches the Metric Filters. For time series, it will count the distinct values for each time granularity.
AVERAGE: Averages the values of the specified column for every record that matches the Metric Filters. For time series, it will average the values for each time granularity.
MIN: Selects the minimum value of the specified column for every record that matches the Metric Filters. For time series, it will select the minimum value for each time granularity.
MAX: Selects the maximum value of the specified column for every record that matches the Metric Filters. For time series, it will select the maximum value for each time granularity.
CUSTOM: Aggregates values based on the provided custom expression.
The statistics for the dimension values. Fetching statistics incurs query costs. Deprecated: issue normal queries for calculating stats. See DimensionStatistics.
Query the Metric in counter format. Returns the Metric’s value for the given time range and filters. Deprecated: use the top-level counter query instead.
The time zone to use. Dates and times are always returned in UTC, but setting the time zone influences relative time ranges and granularities. You can set this to “America/Los_Angeles”, “Europe/Berlin”, or any other value in the IANA time zone database. Defaults to “UTC”.
The Query Filters to apply before retrieving the counter data. If no Query Filters are provided, all data is included. Deprecated: use filterSql instead. See FilterInput.
The counter response object. It contains a single Metric value for the given time range and Query Filters.
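As a hedged sketch, a top-level counter query (the non-deprecated form the deprecation note points to) could look like the following; the metric name and the exact input field names are illustrative assumptions, not the exact schema:

```graphql
query {
  counter(
    input: {
      metricName: "revenue"                       # hypothetical Metric
      timeRange: { relative: LAST_N_DAYS, n: 30 } # assumed time-range shape
      timeZone: "America/Los_Angeles"             # IANA time zone; defaults to "UTC"
    }
  ) {
    value   # the single Metric value for the time range and filters
  }
}
```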
Query the Metric in time series format. Returns arrays of timestamps and the Metric’s values for the given time range and filters. Deprecated: use the top-level timeSeries query instead.
The fields for querying a Metric in time series format. A Metric’s time series query returns the values over a given time range, aggregated by a given time granularity such as day, month, or year.
The time zone to use. Dates and times are always returned in UTC, but setting the time zone influences relative time ranges and granularities. You can set this to “America/Los_Angeles”, “Europe/Berlin”, or any other value in the IANA time zone database. Defaults to “UTC”.
The Query Filters to apply before retrieving the time series data. If no Query Filters are provided, all data is included. Deprecated: use filterSql instead. See FilterInput.
The time series response object. It contains an array of time series labels and an array of Metric values for the given time range and Query Filters.
Query the Metric in leaderboard format. Returns a table (array of rows) with the selected dimensions and the Metric’s corresponding values for the given time range and filters. Deprecated: use the top-level leaderboard query instead.
The fields for querying a Metric in leaderboard format. A Metric’s leaderboard query returns an ordered table of Dimension and Metric values over a given time range.
The time zone to use. Dates and times are always returned in UTC, but setting the time zone influences relative time ranges and granularities. You can set this to “America/Los_Angeles”, “Europe/Berlin”, or any other value in the IANA time zone database. Defaults to “UTC”.
The Query Filters to apply before retrieving the leaderboard data. If no Query Filters are provided, all data is included. Deprecated: use filterSql instead. See FilterInput.
The leaderboard response object. It contains an array of headers and a table (array of rows) with the selected Dimensions and corresponding Metric values for the given time range and Query Filters.
An ordered array of rows. Each row contains the Dimension values and the corresponding Metric value. A Dimension value can be empty. A Metric value will never be empty.
The Environments object. Environments are independent and isolated Propel workspaces for development, staging (testing), and production workloads. Environments are hosted in a specific region, initially in us-east-2 only.
Returns the Metrics within the Environment. A Metric is a business indicator measured over time. Each Metric is associated with one Data Pool, which is a cached table hydrated from your data warehouse, optimized for high-concurrency and low-latency queries. Environments are independent and isolated Propel workspaces for development, staging (testing), and production workloads. The metrics query uses cursor-based pagination typical of GraphQL APIs. You can use the pairs of parameters first and after or last and before to page forward or backward through the results, respectively. For forward pagination, the first parameter defines the number of results to return, and the after parameter defines the cursor to continue from. You should pass the cursor for the last result of the current page to after. For backward pagination, the last parameter defines the number of results to return, and the before parameter defines the cursor to continue from. You should pass the cursor for the first result of the current page to before.
Boosters allow you to optimize Metric Queries for a subset of commonly used Dimensions. A Metric can have one or many Boosters to optimize for different query patterns. Boosters can be understood as an aggregating index. The index is formed from left to right as follows:
The Environments object.Environments are independent and isolated Propel workspaces for development, staging (testing), and production workloads. Environments are hosted in a specific region, initially in us-east-2 only.
The statistics for the dimension values. Fetching statistics incurs query costs. Deprecated: issue normal queries for calculating stats. See DimensionStatistics.
The Environments object. Environments are independent and isolated Propel workspaces for development, staging (testing), and production workloads. Environments are hosted in a specific region, initially in us-east-2 only.
The Data Source object. A Data Source is a connection to your data warehouse. It has the necessary connection details for Propel to access Snowflake or any other supported Data Source.
If you list Data Pools via the dataPools field on a Data Source, you will get the Data Pools for that Data Source. The dataPools field uses cursor-based pagination typical of GraphQL APIs. You can use the pairs of parameters first and after or last and before to page forward or backward through the results, respectively. For forward pagination, the first parameter defines the number of results to return, and the after parameter defines the cursor to continue from. You should pass the cursor for the last result of the current page to after. For backward pagination, the last parameter defines the number of results to return, and the before parameter defines the cursor to continue from. You should pass the cursor for the first result of the current page to before.
The number of new records contained within the Sync, if known. This excludes filtered records. Deprecated: all records are considered to be processed; see processedRecords instead.
The number of updated records contained within the Sync, if known. This excludes filtered records. Deprecated: all records are considered to be processed; see processedRecords instead.
The number of deleted records contained within the Sync, if known. This excludes filtered records. Deprecated: all records are considered to be processed; see processedRecords instead.
The number of filtered records contained within the Sync, due to issues such as a missing timestamp Dimension, if any are known to be invalid. Deprecated: all records are considered to be processed; see processedRecords instead.
The table’s creator. This corresponds to the initiator of the table introspection. It can be either a User ID, an Application ID, or “system” if it was created by Propel.
Build a report, or table, consisting of multiple Metrics broken down by one or more dimensions. The first few columns of the report are the dimensions you choose to break down by. The subsequent columns are the Metrics you choose to query. By default, the report sorts on the first Metric in descending order, but you can configure this with the orderByMetric and sort inputs. Finally, reports use cursor-based pagination. You can control page size with the first and last inputs.
The fields for querying a Metric Report. A Metric Report is a table whose columns include dimensions and Metric values, calculated over a given time range.
The fields required to specify the time range for a time series, counter, or leaderboard Metric query. If no relative or absolute time ranges are provided, Propel defaults to an absolute time range beginning with the earliest record in the Metric’s Data Pool and ending with the latest record. If both relative and absolute time ranges are provided, the relative time range will take precedence. If a LAST_N relative time period is selected, an n ≥ 1 must be provided. If no n is provided or n < 1, a BAD_REQUEST error will be returned.
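To illustrate the two options, a relative and an absolute time range might be specified as in the following sketch; the field names (relative, n, start, stop) are assumptions for illustration:

```graphql
# Relative: the last 30 days. LAST_N periods require n >= 1,
# otherwise a BAD_REQUEST error is returned.
{ relative: LAST_N_DAYS, n: 30 }

# Absolute: an explicit start/stop pair in UTC.
{ start: "2024-01-01T00:00:00Z", stop: "2024-01-31T23:59:59Z" }

# If both are provided, the relative time range takes precedence.
```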
The timestamp field to use when querying. Defaults to the timestamp configured on the Data Pool or Metric, if any.
Set this to filter on an alternative timestamp field.
The time zone to use. Dates and times are always returned in UTC, but setting the time zone influences relative time ranges and granularities. You can set this to “America/Los_Angeles”, “Europe/Berlin”, or any other value in the IANA time zone database. Defaults to “UTC”.
The index of the column to order the Metric Report by. The index is 1-based and defaults to the first Metric column. In other words, by default, reports are ordered by the first Metric; however, you can order by the second Metric, third Metric, etc., by overriding the orderByColumn input. You can also order by dimensions this way.
Additional filters to OR with this one. AND takes precedence over OR.
The Metric Report connection object. It includes headers and rows for a single page of a report. It also allows paging forward and backward to other pages of the report.
An ordered array of columns for a single row, containing the dimension and Metric values as defined in the report input. A Dimension value can be empty. Use this to display a single row within your table.
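Tying the report inputs together: the orderByColumn index described above is 1-based over dimensions followed by Metrics, so with one dimension and two Metrics, the columns run dimension = 1, first Metric = 2, second Metric = 3. A hypothetical report input ordering by the second Metric might look like this sketch (the field names and metric names are illustrative assumptions):

```graphql
query {
  metricReport(
    input: {
      timeRange: { relative: LAST_N_DAYS, n: 30 }  # assumed time-range shape
      dimensions: [{ columnName: "country" }]      # column 1
      metrics: [
        { metricName: "revenue" }                  # column 2 (default order column)
        { metricName: "orders" }                   # column 3
      ]
      orderByColumn: 3                             # order by the second Metric
      first: 25                                    # page size
    }
  ) {
    headers
    rows
  }
}
```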
The data gathered by the SQL query. The data is returned in an N x M matrix format, where the first dimension is the rows retrieved and the second dimension is the columns. Each cell can be either a string or null, and the string can represent a number, text, date, or boolean value.
The fields required to specify the time range for a time series, counter, or leaderboard Metric query. If no relative or absolute time ranges are provided, Propel defaults to an absolute time range beginning with the earliest record in the Metric’s Data Pool and ending with the latest record. If both relative and absolute time ranges are provided, the relative time range will take precedence. If a LAST_N relative time period is selected, an n ≥ 1 must be provided. If no n is provided or n < 1, a BAD_REQUEST error will be returned.
The timestamp field to use when querying. Defaults to the timestamp configured on the Data Pool or Metric, if any.
Set this to filter on an alternative timestamp field.
The time zone to use. Dates and times are always returned in UTC, but setting the time zone influences relative time ranges and granularities. You can set this to “America/Los_Angeles”, “Europe/Berlin”, or any other value in the IANA time zone database. Defaults to “UTC”.
Additional filters to OR with this one. AND takes precedence over OR.
The Data Grid connection. It includes headers and rows for a single page of a Data Grid table. It also allows paging forward and backward to other pages of the Data Grid table.
The fields required to specify the time range for a time series, counter, or leaderboard Metric query. If no relative or absolute time ranges are provided, Propel defaults to an absolute time range beginning with the earliest record in the Metric’s Data Pool and ending with the latest record. If both relative and absolute time ranges are provided, the relative time range will take precedence. If a LAST_N relative time period is selected, an n ≥ 1 must be provided. If no n is provided or n < 1, a BAD_REQUEST error will be returned.
The timestamp field to use when querying. Defaults to the timestamp configured on the Data Pool or Metric, if any.
Set this to filter on an alternative timestamp field.
The time zone to use. Dates and times are always returned in UTC, but setting the time zone influences relative time ranges and granularities. You can set this to “America/Los_Angeles”, “Europe/Berlin”, or any other value in the IANA time zone database. Defaults to “UTC”.
The fields required to specify the time range for a time series, counter, or leaderboard Metric query. If no relative or absolute time ranges are provided, Propel defaults to an absolute time range beginning with the earliest record in the Metric’s Data Pool and ending with the latest record. If both relative and absolute time ranges are provided, the relative time range will take precedence. If a LAST_N relative time period is selected, an n ≥ 1 must be provided. If no n is provided or n < 1, a BAD_REQUEST error will be returned.
The timestamp field to use when querying. Defaults to the timestamp configured on the Data Pool or Metric, if any.
Set this to filter on an alternative timestamp field.
The time zone to use. Dates and times are always returned in UTC, but setting the time zone influences relative time ranges and granularities. You can set this to “America/Los_Angeles”, “Europe/Berlin”, or any other value in the IANA time zone database. Defaults to “UTC”.
The Query Filters to apply before retrieving the counter data. If no Query Filters are provided, all data is included. Deprecated: use filterSql instead.
The fields of a filter. You can construct more complex filters using and and or. For example, to construct a filter equivalent to:
(value > 0 AND value <= 100) OR status = "confirmed"
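you could nest the and and or fields as in this sketch; the operator enum values (GREATER_THAN and so on) are illustrative assumptions:

```graphql
{
  column: "value"
  operator: GREATER_THAN                 # value > 0
  value: "0"
  and: [
    { column: "value", operator: LESS_THAN_OR_EQUAL_TO, value: "100" }
  ]
  or: [
    { column: "status", operator: EQUALS, value: "confirmed" }
  ]
}
```

Because AND takes precedence over OR, the and branch binds to the base filter first, and the or branch then applies to that whole conjunction, yielding (value > 0 AND value <= 100) OR status = "confirmed".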
The fields for querying a Metric in time series format. A Metric’s time series query returns the values over a given time range, aggregated by a given time granularity such as day, month, or year.
The fields required to specify the time range for a time series, counter, or leaderboard Metric query. If no relative or absolute time ranges are provided, Propel defaults to an absolute time range beginning with the earliest record in the Metric’s Data Pool and ending with the latest record. If both relative and absolute time ranges are provided, the relative time range will take precedence. If a LAST_N relative time period is selected, an n ≥ 1 must be provided. If no n is provided or n < 1, a BAD_REQUEST error will be returned.
The timestamp field to use when querying. Defaults to the timestamp configured on the Data Pool or Metric, if any.
Set this to filter on an alternative timestamp field.
The time zone to use. Dates and times are always returned in UTC, but setting the time zone influences relative time ranges and granularities. You can set this to “America/Los_Angeles”, “Europe/Berlin”, or any other value in the IANA time zone database. Defaults to “UTC”.
The time granularity (hour, day, month, etc.) to aggregate the Metric values by.
The available time series granularities. Granularities define the unit of time to aggregate the Metric data for a time series query. For example, if the granularity is set to DAY, then the time series query will return a label and a value for each day. If there are no records for a given time series granularity, Propel will return the label and a value of “0” so that the time series can be properly visualized.
The Query Filters to apply before retrieving the time series data. If no Query Filters are provided, all data is included. Deprecated: use filterSql instead.
The fields of a filter. You can construct more complex filters using and and or. For example, to construct a filter equivalent to:
(value > 0 AND value <= 100) OR status = "confirmed"
The time series values for each group in groupBy, if specified.
The time series response object for a group specified in groupBy. It contains an array of time series labels and an array of Metric values for a particular group.
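The granularity and response fields described above can be sketched as a top-level time series query using a DAY granularity; the metric name and input field names are illustrative assumptions:

```graphql
query {
  timeSeries(
    input: {
      metricName: "orders"                       # hypothetical Metric
      timeRange: { relative: LAST_N_DAYS, n: 7 } # assumed time-range shape
      granularity: DAY                           # one label and value per day
    }
  ) {
    labels   # one timestamp label per day; "0" is returned for empty days
    values
  }
}
```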
Query a Metric in leaderboard format. Returns a table (array of rows) with the selected Dimensions and the Metric’s corresponding values for the given time range and filters.
The fields for querying a Metric in leaderboard format. A Metric’s leaderboard query returns an ordered table of Dimension and Metric values over a given time range.
The fields required to specify the time range for a time series, counter, or leaderboard Metric query. If no relative or absolute time ranges are provided, Propel defaults to an absolute time range beginning with the earliest record in the Metric’s Data Pool and ending with the latest record. If both relative and absolute time ranges are provided, the relative time range will take precedence. If a LAST_N relative time period is selected, an n ≥ 1 must be provided. If no n is provided or n < 1, a BAD_REQUEST error will be returned.
The timestamp field to use when querying. Defaults to the timestamp configured on the Data Pool or Metric, if any.
Set this to filter on an alternative timestamp field.
The time zone to use. Dates and times are always returned in UTC, but setting the time zone influences relative time ranges and granularities. You can set this to “America/Los_Angeles”, “Europe/Berlin”, or any other value in the IANA time zone database. Defaults to “UTC”.
The Query Filters to apply before retrieving the leaderboard data. If no Query Filters are provided, all data is included. Deprecated: use filterSql instead.
The fields of a filter. You can construct more complex filters using and and or. For example, to construct a filter equivalent to:
(value > 0 AND value <= 100) OR status = "confirmed"
Additional filters to OR with this one. AND takes precedence over OR.
The leaderboard response object. It contains an array of headers and a table (array of rows) with the selected Dimensions and corresponding Metric values for the given time range and Query Filters.
An ordered array of rows. Each row contains the Dimension values and the corresponding Metric value. A Dimension value can be empty. A Metric value will never be empty.
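As an illustrative sketch, a top-level leaderboard query producing such a response might look like this; the dimension, sort, and row-limit field names, as well as the metric name, are assumptions:

```graphql
query {
  leaderboard(
    input: {
      metricName: "revenue"                       # hypothetical Metric
      timeRange: { relative: LAST_N_DAYS, n: 30 } # assumed time-range shape
      dimensions: [{ columnName: "country" }]     # Dimensions to break down by
      sort: DESC                                  # highest Metric values first
      rowLimit: 10
    }
  ) {
    headers   # Dimension names followed by the Metric
    rows      # one row per Dimension-value combination
  }
}
```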
Returns the Deletion Job specified by the given ID. The Deletion Job represents the asynchronous process of deleting data, given some filters, inside a Data Pool.
Deletion Job scheduled for a specific Data Pool. The Deletion Job represents the asynchronous process of deleting data, given some filters, inside a Data Pool. It tracks the deletion process until it is finished, showing the progress and the outcome when it is finished.
The Environments object. Environments are independent and isolated Propel workspaces for development, staging (testing), and production workloads. Environments are hosted in a specific region, initially in us-east-2 only.
Returns the AddColumnToDataPoolJob specified by the given ID. The AddColumnToDataPoolJob represents the asynchronous process of adding a column, given its name and type, to a Data Pool.
AddColumnToDataPoolJob scheduled for a specific Data Pool. The Add Column Job represents the asynchronous process of adding a column, given its name and type, to a Data Pool. It tracks the process of adding a column until it is finished, showing the progress and the outcome when it is finished.
Environment to which the AddColumnToDataPoolJob belongs.
The Environments object. Environments are independent and isolated Propel workspaces for development, staging (testing), and production workloads. Environments are hosted in a specific region, initially in us-east-2 only.
Returns the UpdateDataPoolRecords Job specified by the given ID. The UpdateDataPoolRecords Job represents the asynchronous process of updating records inside a Data Pool.
UpdateDataPoolRecords Job scheduled for a specific Data Pool. The Update Data Pool Records Job represents the asynchronous process of updating records, given some filters, inside a Data Pool. It tracks the process of updating records until it is finished, showing the progress and the outcome when it is finished.
Environment to which the UpdateDataPoolRecords Job belongs.
The Environments object. Environments are independent and isolated Propel workspaces for development, staging (testing), and production workloads. Environments are hosted in a specific region, initially in us-east-2 only.
Whether the Data Pool has access control enabled or not. If the Data Pool has access control enabled, Applications must be assigned Data Pool Access Policies in order to query the Data Pool and its Metrics.
Validates a custom expression against the Data Pool’s available columns. If the provided expression is invalid, the ValidateExpressionResult response will contain a reason explaining why.
The Data Pool’s unique ID column. Propel uses the primary timestamp and a unique ID to compose a primary key for determining whether records should be inserted, deleted, or updated within the Data Pool. Deprecated: will be removed; use table settings to define the primary key. See UniqueId.
The Environments object. Environments are independent and isolated Propel workspaces for development, staging (testing), and production workloads. Environments are hosted in a specific region, initially in us-east-2 only.
Whether the Data Pool has access control enabled or not. If the Data Pool has access control enabled, Applications must be assigned Data Pool Access Policies in order to query the Data Pool and its Metrics.
Validates a custom expression against the Data Pool’s available columns. If the provided expression is invalid, the ValidateExpressionResult response will contain a reason explaining why.
The Data Pool’s unique ID column. Propel uses the primary timestamp and a unique ID to compose a primary key for determining whether records should be inserted, deleted, or updated within the Data Pool. Deprecated: will be removed; use table settings to define the primary key. See UniqueId.
The Application object. Propel Applications represent the web or mobile app you are building. They provide the API credentials that allow your client- or server-side app to access the Propel API. The Application’s Propeller determines the speed and cost of your Metric Queries.
The Environments object. Environments are independent and isolated Propel workspaces for development, staging (testing), and production workloads. Environments are hosted in a specific region, initially in us-east-2 only.
A Propeller determines your Application’s query processing power. The larger the Propeller, the faster the queries and the higher the cost. Every Propel Application (and therefore every set of API credentials) has a Propeller that determines the speed and cost of queries.
The column object. Once a table introspection succeeds, it creates a new table object for every table it introspected. Within each table object, it also creates a column object for every column it introspected.
The column’s creator. This corresponds to the initiator of the table introspection. It can be either a User ID, an Application ID, or “system” if it was created by Propel.
The suggested Data Pool column type to use when converting this Data Source column to a Data Pool column. Propel makes this suggestion based on the Data Source column type. If the Data Source column type is unsupported, this field returns null. Sometimes, you know better which Data Pool column type to convert to; in these cases, refer to supportedDataPoolColumnTypes for the full set of supported conversions. See ColumnType
The set of supported Data Pool column types you can use when converting this Data Source column to a Data Pool column. If the Data Source column type is unsupported, this field returns an empty array. For example, a numeric Data Source column type could be converted to a narrower or wider numeric Data Pool column type; a string-valued Data Source column type could be mapped to a date or timestamp Data Pool column type. See ColumnType
The Data Source object. A Data Source is a connection to your data warehouse. It has the necessary connection details for Propel to access Snowflake or any other supported Data Source.
If you list Data Pools via the dataPools field on a Data Source, you will get the Data Pools for that Data Source. The dataPools field uses cursor-based pagination, typical of GraphQL APIs. You can use the parameter pairs first and after, or last and before, to page forward or backward through the results, respectively. For forward pagination, the first parameter defines the number of results to return, and the after parameter defines the cursor to continue from; pass the cursor for the last result of the current page to after. For backward pagination, the last parameter defines the number of results to return, and the before parameter defines the cursor to continue from; pass the cursor for the first result of the current page to before.
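As a sketch, forward pagination over dataPools might look like the following (the connection shape with edges, cursor, and pageInfo follows the standard GraphQL connection pattern; the exact node fields here are assumptions):

```graphql
# Fetch the first 10 Data Pools of a Data Source, continuing from a cursor.
query DataPoolsPage($dataSourceId: ID!, $cursor: String) {
  dataSource(id: $dataSourceId) {
    dataPools(first: 10, after: $cursor) {
      edges {
        cursor
        node {
          id
          uniqueName
        }
      }
      pageInfo {
        hasNextPage
        endCursor
      }
    }
  }
}
```

To fetch the next page, pass the cursor of the last result of the current page (for example, pageInfo.endCursor) as the after variable; swap first/after for last/before to page backward.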
A list of checks performed on the Data Source during its most recent connection attempt.
Show DataSourceCheck
The Data Source Check object. Data Source Checks are executed when setting up your Data Source. They verify that Propel will be able to receive data and set up Data Pools. The exact checks performed vary by Data Source; for example, Snowflake-backed Data Sources have their own specific checks.
Enables or disables access control for the Data Pool. If the Data Pool has access control enabled, Applications must be assigned Data Pool Access Policies in order to query the Data Pool and its Metrics.
HTTP basic access authentication credentials. You must configure these same credentials to be included in the X-Amz-Firehose-Access-Key header when Amazon Data Firehose issues requests to its custom HTTP endpoint.
Override the Data Pool’s table settings. These describe how the Data Pool’s table is created in ClickHouse. A default will be chosen based on the Data Pool’s timestamp value, if any; you can override these defaults to specify a custom table engine, custom ORDER BY, and so on.
Show TableSettings
A Data Pool’s table settings. These describe how the Data Pool’s table is created in ClickHouse.
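To illustrate the override described above, a table settings input might look like the following fragment (the field names engine and orderBy, and their values, are assumptions for illustration; see TableSettings for the actual shape):

```graphql
# Hypothetical input fragment overriding the default ClickHouse table settings.
tableSettings: {
  engine: { type: MERGE_TREE }          # custom table engine
  orderBy: ["timestamp", "account_id"]  # custom ORDER BY columns
}
```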
Enables or disables access control for the Data Pool. If the Data Pool has access control enabled, Applications must be assigned Data Pool Access Policies in order to query the Data Pool and its Metrics.
HTTP basic access authentication credentials. You must configure these same credentials to be included in the X-Amz-Firehose-Access-Key header when Amazon Data Firehose transmits records from your DynamoDB table to its custom HTTP endpoint.
Override the Data Pool’s table settings. These describe how the Data Pool’s table is created in ClickHouse. A default will be chosen based on the Data Pool’s timestamp value, if any; you can override these defaults to specify a custom table engine, custom ORDER BY, and so on.
Show TableSettings
A Data Pool’s table settings. These describe how the Data Pool’s table is created in ClickHouse.
Copy this value into the X-Amz-Firehose-Access-Key header when configuring your Amazon Data Firehose to transmit records from your DynamoDB table to its custom HTTP endpoint.
The HTTP Basic authentication settings for uploading new data. If this parameter is not provided, anyone with the URL to your tables will be able to upload data. While it’s OK to test without HTTP Basic authentication, we recommend enabling it.
The connection settings for an Amazon S3 Data Source. These include the Amazon S3 bucket name, the AWS access key ID, and the tables (along with their paths). We do not allow fetching the AWS secret access key after it has been set.
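For illustration, the S3 connection settings described above might be supplied as an input like the following (all field names and values here are assumptions, not the authoritative schema):

```graphql
# Hypothetical input fragment for an Amazon S3 Data Source connection.
connectionSettings: {
  bucket: "my-analytics-bucket"
  awsAccessKeyId: "AKIA..."
  awsSecretAccessKey: "..."   # write-only: cannot be fetched after it is set
  tables: [{ name: "orders", path: "exports/orders/" }]
}
```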
Enables or disables access control for the Data Pool. If the Data Pool has access control enabled, Applications must be assigned Data Pool Access Policies in order to query the Data Pool and its Metrics.
The HTTP Basic authentication settings for the Twilio Segment Data Source URL. If this parameter is not provided, anyone with the webhook URL will be able to send events. While it’s OK to test without HTTP Basic authentication, we recommend enabling it.
Override the Data Pool’s table settings. These describe how the Data Pool’s table is created in ClickHouse. A default will be chosen based on the Data Pool’s timestamp and uniqueId values, if any; you can override these defaults to specify a custom table engine, custom ORDER BY, and so on.
Show TableSettings
A Data Pool’s table settings. These describe how the Data Pool’s table is created in ClickHouse.
Enables or disables access control for the Data Pool. If the Data Pool has access control enabled, Applications must be assigned Data Pool Access Policies in order to query the Data Pool and its Metrics.
The HTTP Basic authentication settings for the Webhook Data Source URL. If this parameter is not provided, anyone with the webhook URL will be able to send events. While it’s OK to test without HTTP Basic authentication, we recommend enabling it.
Override the Data Pool’s table settings. These describe how the Data Pool’s table is created in ClickHouse. A default will be chosen based on the Data Pool’s timestamp and uniqueId values, if any; you can override these defaults to specify a custom table engine, custom ORDER BY, and so on.
Show TableSettings
A Data Pool’s table settings. These describe how the Data Pool’s table is created in ClickHouse.
The unique ID column, if any. Propel uses the primary timestamp and a unique ID to compose a primary key for determining whether records should be inserted, deleted, or updated. Deprecated: will be removed; use table settings to define the primary key.