Introduction
This guide provides instructions for visualizing Develocity build data in Grafana dashboards by exporting the build data as build models to AWS S3 and querying them via AWS Athena.
If you are looking to visualize Develocity build data without using AWS services, you can do that by installing the Develocity Reporting Kit.
Build data
A Build Scan® is a persisted record of a single build’s captured data. Each Build Scan consists of thousands to millions of very fine-grained events. Build models aggregate these events into higher-level structures to expose easily consumable, summarized information about the build.
The build models can be exported to AWS S3 to serve as input to an external big data engine, and they can be consumed via the Develocity API. This guide focuses on exporting the build models to AWS S3 and their consumption by AWS Athena.
Build models are currently available only for Gradle and Maven builds.
Infrastructure
The process for setting up the infrastructure at a high level is as follows:
- Configure Develocity to export the build models to AWS S3.
- Create a database in AWS Glue containing the table of the build models.
- Create a workgroup in AWS Athena to query the table with the build models.
- Create a table in AWS Athena that describes the schema of the build models.
- Verify the table setup by issuing a build model query from AWS Athena.
- Create a data source in Grafana for the table you created in AWS Athena.
- Create a Grafana dashboard that visualizes data from the build models.
- Import the Grafana dashboards provided by Gradle.
Prerequisites
The prerequisites are as follows:
- The Administer Develocity permission to enable the build model export to AWS S3.
- An AWS account with permissions to configure AWS Athena and AWS Glue.
- A Grafana instance (on-premise or cloud) with the Athena plugin installed.
Configure Develocity
Enable S3 object storage
Develocity can export the build models to S3. This requires first enabling and configuring S3 object storage in Develocity via the Helm chart. See Appendix A for the minimal configuration. You can skip this step if you already store your Build Scan data in S3.
This guide assumes the S3 bucket is named develocity, but you may choose a different name.
You do not need to store your Build Scan data in S3 to export the build models to S3.
Enable build model export
Next, enable the feature that exports the build models to S3:
- Navigate to
- Add feature.buildModelExport=ENABLED in Config parameters
- Select Save
- Follow the instructions to restart the Develocity server
See Appendix B for enabling the build model export feature in the Helm chart and when using background processors.
Enabling the build model export feature will require increased memory. Reach out to your technical contact from Gradle for assistance setting appropriate values.
Pay attention to the number of exported build models. It will grow indefinitely, increasing S3 storage costs and the Athena cost of every query that scans the build models, unless you set up expiration policies in AWS S3.
Configure AWS Glue
Athena uses Glue to store the table schema. To create a database inside the data catalog, follow the Glue instructions at Working with databases on the AWS Glue console.
This guide assumes the created database is named develocity-db, but you may choose a different name.
Configure AWS Athena
Create an Athena workgroup and table for the build models from Develocity. You will need the AmazonAthenaFullAccess managed policy to perform these tasks.
Create an AWS Athena workgroup
An Athena workgroup is required to define the parameters for running queries against the build models. To create an Athena workgroup, follow the Athena instructions at Create a workgroup.
In step 4, do the following:
- Choose Athena SQL as the engine type.
- In Location of query result on the Query result configuration pane, specify a new folder in your S3 bucket to store Athena’s query results.
This guide assumes the query results folder is named query-results in an S3 bucket named develocity and the created workgroup is named develocity-wg, but you may choose different names.
Pay attention to AWS’ suggestions for lifecycle configuration. Resource usage and costs may increase significantly if this is not configured.
See Using workgroups for running queries for how to configure more complex scenarios.
Create an AWS Athena table
Create the table that describes the schema of the build models:
- Select the database you created in Glue previously.
- In the Query Editor, select Create in Tables and Views, then select Create Table.
- Enter the DDL statement for your version of Develocity from Appendix C.
- Edit the LOCATION parameter and the storage.location.template table property to point to the S3 bucket available to Develocity. You can find more information about these properties in the Athena documentation at partition projection.
LOCATION '«BUCKET»/build-models' (1)
TBLPROPERTIES(
  'projection.enabled'='true',
  'projection.startdate.format'='yyyy-MM-dd',
  'projection.startdate.range'='2020-01-01,NOW', (2)
  'projection.startdate.type'='date',
  'storage.location.template'='«BUCKET»/build-models/startdate=${startdate}/' (3)
)
(1) Insert the full S3 URI here. For example 's3://develocity/build-models'.
(2) Adjust the range to match the earliest date of your build models. You can also use a relative time range matching your Build Scan retention, like 'projection.startdate.range'='NOW-90DAYS,NOW'.
(3) Insert the full S3 URI here. For example 'storage.location.template'='s3://develocity/build-models/startdate=${startdate}/'.
- Select Run.
This guide assumes the table is named build. It is recommended to keep this name during the installation for ease of later integration, but you may choose a different name in the DDL statement.
Pay attention to the required suffix /startdate=${startdate}/ (including the trailing slash) in the storage.location.template table property.
See Creating tables using AWS Glue or the Athena console for more details.
Verify the AWS Athena setup
Run a simple query against the build table to verify that the table is properly set up and connected to S3.
- In the Query Editor, select the database you created in Glue and issue a simple build model query:
SELECT COUNT(*) AS "Build Count last 7 days" FROM build WHERE startdate BETWEEN CURRENT_DATE - INTERVAL '7' DAY AND CURRENT_DATE
The startdate column is a partition key in the table, and queries that filter on this column will be more efficient.
- The results are displayed in a table below the Query Editor.
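If the count query returns a result, you can also try a slightly richer query against the nested build model fields. The following is only a sketch, assuming the Develocity 2024.2.x schema from Appendix C; field names may differ for other schema versions:
-- Count failed vs. successful Gradle builds per day over the last 7 days.
-- Assumes the 2024.2.x schema, where gradleAttributes.hasFailed is a BOOLEAN;
-- Maven-only builds have a NULL gradleAttributes struct and are not counted.
SELECT
  startdate,
  COUNT_IF(gradleAttributes.hasFailed) AS failed_builds,
  COUNT_IF(NOT gradleAttributes.hasFailed) AS successful_builds
FROM build
WHERE startdate BETWEEN CURRENT_DATE - INTERVAL '7' DAY AND CURRENT_DATE
GROUP BY startdate
ORDER BY startdate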
Configure Grafana
Create a data source for the database and table referencing the build models and create a dashboard that displays build data by querying this data source.
Connect to AWS Athena
Add a data source pointing to your Glue database and Athena table. To create a data source, follow the Grafana instructions at Configure the data source in Grafana.
In step 2, do the following:
- Enter your AWS credentials and default region in Connection Details (see the Athena plugin for Grafana documentation for the different authentication options).
- Select AWSDataCatalog from Data source.
- Select the database you created in Glue from Database.
- Select the workgroup you created in Athena from Workgroup.
- Leave Output Location empty to use your workgroup’s default output location.
- Click Save & test.
If you use Grafana Cloud and choose to provide authorization using the AWS Assume Role feature, note that this feature is in private preview. You must contact technical support at Grafana to enable it for your stack.
See Grafana data sources for more information.
Create a dashboard
Create a Grafana dashboard that displays Develocity build data in a chart. To create a dashboard, follow the Grafana instructions at Create a dashboard.
In step 4, select the data source you created previously from Data source.
The following image shows a bar chart visualization of the number of builds grouped by day using the following query:
SELECT count(DISTINCT id), startdate
FROM build
WHERE $__dateFilter(startdate) (1)
GROUP BY startdate
(1) The $__dateFilter macro is provided by the Athena data source and is used to inject the time range that is currently selected in the Grafana dashboard as a filter criterion of the query.
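As a further sketch, a panel that charts the average Gradle build duration per day could use a query along the following lines. This assumes the 2024.2.x schema and that buildDuration is reported in milliseconds:
SELECT
  startdate,
  -- buildDuration is assumed to be in milliseconds; convert to seconds for readability
  AVG(gradleAttributes.buildDuration) / 1000.0 AS avg_build_duration_seconds
FROM build
WHERE $__dateFilter(startdate)
GROUP BY startdate
ORDER BY startdate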
See Grafana dashboards for more information.
Import the Gradle-provided dashboards
Gradle provides a set of pre-defined, JSON-based Develocity dashboard definitions. Reach out to your technical contact from Gradle to get assistance integrating them into your on-premise or cloud Grafana instance.
The dashboard bundle can be downloaded from Dashboard downloads.
The required steps to import the dashboards provided by Gradle differ depending on the type of Grafana installation. Besides the manual import of dashboards and the simple built-in provisioning mechanism, Grafana also provides a variety of Infrastructure as Code options to manage dashboards.
Compatibility
Grafana
The Gradle-provided dashboards have been tested against Grafana 10.2.3 and are likely to work with newer versions of Grafana. We do not recommend using earlier versions of Grafana as some dashboards will not be displayed properly.
Build environment
Most of the Gradle-provided dashboards rely on the presence of specific tags and custom values on Build Scans to render the desired visualizations.
Tags used within the dashboards:
- CI - Used throughout the dashboards to classify builds as either CI (tag is present) or local (tag is absent) builds
Custom values used within the dashboards:
- Git repository - The URI of the Git repository that the build was run in, for example git@github.com:gradle/gradle.git
- CI provider - The name of the CI provider that ran the build, for example GitHub Actions
If you are using the Common Custom User Data Gradle plugin or Common Custom User Data Maven extension, this information will be automatically added to every Build Scan.
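To check whether your existing build models already follow these conventions, you can run a query like the following sketch against the build table (assuming the 2024.2.x schema; CONTAINS tests for the presence of the CI tag):
-- Classify Gradle builds over the last 30 days by the presence of the CI tag.
SELECT
  startdate,
  COUNT_IF(CONTAINS(gradleAttributes.tags, 'CI')) AS ci_builds,
  COUNT_IF(NOT CONTAINS(gradleAttributes.tags, 'CI')) AS local_builds
FROM build
WHERE startdate BETWEEN CURRENT_DATE - INTERVAL '30' DAY AND CURRENT_DATE
GROUP BY startdate
ORDER BY startdate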
Prerequisites
Upon extracting the Gradle-provided dashboard zip bundle, you will find the following directory structure:
/
├─ kustomization.yaml (1)
├─ dashboards/ (2)
│ ├─ 00 - Overview/
│ │ ├─ global-volume.json
│ │ ├─ ...
│ ├─ 10 - Environment/
│ │ ├─ build-tools.json
│ │ ├─ ...
│ ├─ ...
├─ athena-views/
│ ├─ views.sql (3)
(1) Kustomization file to import the dashboards into a Kubernetes cluster.
(2) Directory containing the JSON files of the dashboards, organized by category.
(3) View definitions for Athena.
Throughout the following steps, all shell commands are expected to be run in the root directory of the extracted dashboard bundle.
The Gradle-provided dashboards query views of the builds table rather than the table itself. For the queries in the Gradle-provided dashboards to work, you therefore need to create the following views in your Athena database:
- build_summary
- gradle_task_summary
- maven_goal_summary
- task_mapping
- unit_execution_summary
- plugin_summary
- test_performance_summary
The DDL for the views is contained in athena-views/views.sql. You can run the DDL script in your Athena database using the Athena console, similar to how the creation of the build table is described in Create an AWS Athena table.
These views rely on your Athena table being named builds. If you named your table differently, you will need to adjust the view definitions accordingly.
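To see which table a view currently references before or after adjusting it, you can inspect its definition in the Athena query editor. For example, assuming the build_summary view from views.sql has already been created:
-- Show the definition of one of the provided views, including the table it selects from.
SHOW CREATE VIEW build_summary
To retarget a view at a differently named table, edit the FROM clause of the corresponding view definition in views.sql and re-run that statement.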
You will also need to insert the UID of your Grafana Athena datasource into the dashboard JSON files. You can use the following command to globally insert the UID into all JSON files:
$ find ./dashboards -type f -name '*.json' -exec sed -i.bak 's/athena-datasource-uid-placeholder/YOUR-DATASOURCE-UID/g' {} \; \
&& find ./dashboards -type f -name '*.json.bak' -exec rm {} \;
You can find your datasource’s UID in the Grafana UI by navigating to Connections → Data Sources and clicking on the Athena datasource. The UID is displayed in the URL.
Grafana Installation Types
Using Grafana’s built-in provisioning mechanism
Use if you have access to the file system of the Grafana server.
Grafana has a provisioning mechanism that allows you to point it to a directory containing dashboard definitions in JSON format. Grafana will then watch that directory for changes and automatically import new dashboards or update existing ones.
Steps:
- Replace the datasource UID placeholder in the dashboard files with the Athena datasource UID from your installation as described in prerequisites.
- Copy the contents of the dashboards folder from the provided zip file into a directory on your Grafana server. This guide assumes you copied the contents into /var/lib/grafana/dashboards, so that you end up with the following directory structure:
/var/lib/grafana/dashboards/
├─ 00 - Overview/
│  ├─ global-volume.json
│  ├─ project-volume.json
├─ 10 - Environment/
│  ├─ build-tools.json
│  ├─ ...
├─ ...
- Place a dashboards.yaml file with the following content into Grafana’s provisioning directory at /etc/grafana/provisioning/dashboards/:
apiVersion: 1

providers:
  - name: 'Gradle Develocity Dashboards'
    type: file
    allowUiUpdates: false
    disableDeletion: true
    updateIntervalSeconds: 60
    options:
      path: /var/lib/grafana/dashboards
      foldersFromFilesStructure: true
Using Grafana’s provisioning mechanism with Helm and Kubernetes
Use if you are self-hosting Grafana in a Kubernetes cluster, and you manage your Grafana deployment with Helm.
The Grafana Helm chart contains an option which allows Grafana to discover and collect dashboards stored in ConfigMaps within the cluster.
Steps:
- Replace the datasource UID placeholder in the dashboard files with the Athena datasource UID from your installation as described in prerequisites.
- Place the dashboards into ConfigMaps having the label grafana_dashboard: "1". Grafana recommends creating one ConfigMap per dashboard. You can create ConfigMaps individually per dashboard using the following kubectl command:
$ kubectl create configmap overview-global-volume-dashboard \
    -n <YOUR-NAMESPACE> \
    --from-file='./dashboards/00 - Overview/global-volume.json' \
    --dry-run=client \
    -o yaml \
    | kubectl label -f- --dry-run=client -o yaml --local grafana_dashboard=1 \
    | kubectl annotate -f- --dry-run=client -o yaml --local 'grafana_folder=/tmp/dashboards/00 - Overview' \
    | kubectl apply --server-side=true -f -
To help you import all of the dashboards while preserving the recommended folder structure, a kustomize file is included in the dashboard bundle. It can be applied with this command:
$ kubectl kustomize ./ | kubectl apply -n <YOUR-NAMESPACE> --server-side=true -f -
- Enable and configure the dashboards sidecar container in Grafana’s values.yaml file:
sidecar:
  dashboards:
    enabled: true
    label: grafana_dashboard
    label_value: '1'
    folderAnnotation: grafana_folder
    provider:
      name: 'Gradle Develocity Dashboards'
      type: file
      allowUiUpdates: false
      disableDeletion: true
      foldersFromFilesStructure: true
Refer to Grafana’s Helm installation guide and the Helm Chart documentation for more information.
Manually importing dashboards via the Grafana UI
Use if you only need a small subset of the provided dashboards and you cannot use any of the more automated methods described above.
You need to have the Editor role in Grafana to import dashboards via the UI.
Steps
- Replace the datasource UID placeholder in the dashboard files with the Athena datasource UID from your installation as described in prerequisites.
- Go to your Grafana’s dashboard view at https://your-grafana-instance.com/dashboards.
- Optional: Create a new folder for the Gradle dashboards by clicking on the New button and selecting Create Folder from the dropdown.
- Click the New button and select Import.
- Drag and drop the JSON file of the dashboard you want to import into the drop zone, or click on the drop zone to open a file dialog, then click Load.
- Confirm the name of the dashboard and the folder you want to import it into and click Import.
Using Amazon Managed Grafana with Terraform
Use if you want to use Amazon Managed Grafana, and you manage your infrastructure with Terraform.
This only deploys the Amazon Managed Grafana instance and imports the dashboards; it requires an already working Athena setup to succeed. Amazon Managed Grafana uses either AWS IAM Identity Center or SAML. The provided terraform scripts only support IAM Identity Center. If you want to use SAML, adjust the terraform modules in the terraform/modules directory accordingly.
As part of the dashboard bundle, we provide a set of terraform modules and scripts to create and populate an Amazon Managed Grafana instance along with the required permissions, buckets, and other ancillary resources. These modules are placed in the terraform/modules directory and can be integrated into your existing infrastructure-as-code pipeline.
In the following steps we provide a set of shell scripts to execute a deployment of the Amazon Managed Grafana instance. These scripts are provided as a convenience and can be used as a reference to integrate the terraform modules into your existing infrastructure-as-code pipeline.
Steps
- Make sure you have the AWS CLI installed and configured with the necessary permissions.
- Make sure you have Terraform installed.
- Unzip the provided dashboard bundle and navigate to the terraform directory. Every subsequent step should be executed in this directory.
- In the terraform directory there are three shell scripts that are used in the subsequent steps to deploy the Amazon Managed Grafana instance:
  - provision-grafana.sh: to deploy the Amazon Managed Grafana instance and the resources it requires.
  - provision-grafana-datasources.sh: to install the Athena plugin and create the Athena datasource.
  - provision-drv-dashboards.sh: to import the dashboards.
Prepare your environment
Unzip the provided dashboard bundle:
$ unzip gradle-dashboards-bundle-X-X-X.zip
Change your working directory to the terraform directory of the unzipped dashboard bundle:
$ cd terraform
Provision the Amazon Managed Grafana instance
The provided modules assume the use of AWS IAM Identity Center to manage the users and groups that have access to the Amazon Managed Grafana instance. In this case, a set of user/group SSO IDs is required to set up the Grafana instance. These IDs are NOT the same as AWS user IDs; they need to be retrieved from the AWS SSO console. Please contact your AWS SSO administrator to get these IDs.
Run the provision-grafana.sh script to deploy the Amazon Managed Grafana instance and the resources it requires.
$ provision-grafana.sh \
--bucket-name develocity \(1)
--aws-region eu-west-1 \(2)
--aws-profile my-profile \(3)
--develocity-drv-grafana-admin-user-ids my-sso-id \(4)
--develocity-drv-grafana-admin-group-ids my-sso-id \(4)
--prefix develocity \(5)
--apply \ (6)
--destroy (6)
(1) The name of the bucket where your build models are stored. This is normally the same bucket as the one configured for Develocity.
(2) The region where you want to deploy the Amazon Managed Grafana instance.
(3) The name of the AWS CLI profile you want to use to deploy the resources.
(4) The SSO IDs of the groups or users you want to have admin access to the Grafana instance. This is a repeatable argument. At least one of these arguments must be provided.
(5) An optional prefix to use for the resources created by the script. This is useful to avoid naming conflicts when deploying to the same AWS account.
(6) To apply or destroy the resources, respectively. If neither is provided, the script defaults to creating a terraform plan. In either case you will still need to interactively approve the plan before it is applied.
Install the Athena plugin and create the Athena datasource
Once the Amazon Managed Grafana instance is deployed, you can then install the Athena plugin and create the Athena datasource by running provision-grafana-datasources.sh:
$ provision-grafana-datasources.sh \
--db-name develocity-db \(1)
--grafana-URL https://my-grafana-URL.com \(2)
--grafana-token my-grafana-token \(3)
--workgroup develocity-wg \(4)
--aws-region eu-west-1 \(5)
--apply \(6)
--destroy (6)
(1) The name of the Glue database where the Athena tables were created.
(2) Optional parameter, normally provided automatically by the previous script. The URL of the Amazon Managed Grafana instance.
(3) Optional parameter, normally provided automatically by the previous script. The API token for the Amazon Managed Grafana instance.
(4) Optional parameter, normally provided automatically by the previous script. The name of the Athena workgroup.
(5) Optional parameter, normally provided automatically by the previous script. The region where the Amazon Managed Grafana instance was deployed.
(6) To apply or destroy the resources, respectively. If neither is provided, the script defaults to creating a terraform plan. In both cases you will still need to interactively approve the plan before it is applied.
Import the provided dashboards
Finally, you can import the dashboards to the provisioned Grafana instance by running provision-drv-dashboards.sh:
$ provision-drv-dashboards.sh \
--dashboard-folder-path ../dashboards \(1)
--grafana-URL https://my-grafana-URL.com \(2)
--grafana-token my-grafana-token \(3)
--datasource-id my-datasource-id \(4)
--apply \(5)
--destroy (5)
(1) The path to the directory containing the dashboard JSON files. If you are using the provided dashboard bundle, this will be ../dashboards.
(2) Optional parameter, normally provided automatically by the previous script. The URL of the Amazon Managed Grafana instance.
(3) Optional parameter, normally provided automatically by the previous script. The API token for the Amazon Managed Grafana instance.
(4) Optional parameter, normally provided automatically by the previous script. The ID of the datasource created in the previous stage.
(5) To apply or destroy the resources, respectively. If neither is provided, the script defaults to creating a terraform plan. In either case you will still need to interactively approve the plan before it is applied.
For security reasons, the lifetime of the Grafana token that is used to configure the Amazon Managed Grafana instance is limited. If the token expires, you will receive
If the script executions fail for some reason, the command terraform output stages/stage_name can be used to get the outputs of the executed stage. This should provide useful information to investigate the failure, along with the Grafana URL and the administrator token.
Deploying an Amazon Managed Grafana instance requires many permissions, because it is a complex operation involving several resource types. The list of permissions and AWS managed policies that need to be attached to the user/role executing the terraform scripts is shown below:
- Permissions required for managing IAM resources. These permissions are required to manage the role that can be assumed by the managed Grafana instance and that has access to the bucket storing the build-model data used by Athena.
  - iam:ListRoles
  - iam:CreateRole
  - iam:GetRole
  - iam:DeleteRole
  - iam:ListRolePolicies
  - iam:ListAttachedRolePolicies
  - iam:ListInstanceProfilesForRole
  - iam:AttachRolePolicy
  - iam:PutRolePolicy
  - iam:GetRolePolicy
  - iam:DeleteRolePolicy
  - iam:DetachRolePolicy
- AWS managed policy required to manage S3 connections. This policy is required to create and manage the bucket that will store the query results produced by Athena.
  - AmazonS3FullAccess
- AWS managed policy required to manage Athena resources. This policy is required to create and manage the Athena workgroup and the Athena database.
  - AmazonAthenaFullAccess
- AWS managed policy required to administer the Amazon Managed Grafana instance. This policy is required to create and manage the Amazon Managed Grafana instance.
  - AWSGrafanaAccountAdministrator
- AWS managed policies to manage the SSO settings. These policies are required to manage the SSO settings for the Amazon Managed Grafana instance.
  - AWSOrganizationsFullAccess
  - AWSSSODirectoryAdministrator
  - AWSSSOMasterAccountAdministrator
Upgrading
Upgrading Athena tables
When upgrading Develocity, the schema of the build models may change, and new data may become available to be consumed via the dashboards. To upgrade the Athena tables to the new schema, follow these steps:
The Athena tables are created by reading data from S3 and do not have the ability to remove any data from it. Because of this, dropping the tables will not result in any data loss.
- Go to your AWS Console.
- For each of the tables and views that were created, perform one of the following steps:
  - Click the three dots next to the table name, and click Delete table.
  - Run the command DROP TABLE `table_name` for tables, and DROP VIEW view_name for views, in the query editor (see the example after this list).
- Download the latest dashboard bundle and follow the installation procedure as previously.
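For reference, a cleanup sketch using the default names from this guide is shown below. The table and view names are assumptions based on the defaults described earlier; adjust them if you chose different names, and run each statement separately in the Athena query editor:
-- Remove the table and the Gradle-provided views before recreating them
-- with the DDL and views.sql for the new Develocity version.
DROP TABLE IF EXISTS `build`;
DROP VIEW IF EXISTS build_summary;
DROP VIEW IF EXISTS gradle_task_summary;
DROP VIEW IF EXISTS maven_goal_summary;
DROP VIEW IF EXISTS task_mapping;
DROP VIEW IF EXISTS unit_execution_summary;
DROP VIEW IF EXISTS plugin_summary;
DROP VIEW IF EXISTS test_performance_summary;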
Upgrading Grafana dashboards
When upgrading Develocity, as new data becomes available, new dashboards may be created to surface more insights. To upgrade the Grafana dashboards, follow these steps:
This will delete every dashboard in your Grafana instance. If you have created dashboards of your own, or have imported the provided dashboards into a Grafana instance that contains other dashboards, make sure to remove only the dashboards supplied by Gradle.
The process of deploying the new set of dashboards depends on how they were originally installed or provisioned, so follow the instructions appropriate to your Grafana installation type.
Upgrade dashboards provisioned using Grafana’s built-in provisioning
If you have followed the procedure of importing the dashboards via Grafana provisioning, you can follow these steps to upgrade the dashboards:
- Open your Grafana provisioning folder.
- Delete all folders and files you want to update within the folder.
  - In a Docker Desktop environment, this may cause Docker to stop syncing changes, so you may need to restart the container, delete the files one by one, or enable File Sharing in your Docker Desktop settings.
- Open your Grafana and log in with a user account with edit privileges.
- Open the Dashboards view, and select the checkbox next to the Name header to select all folders.
- Click Delete. This removes all dangling folders from the Grafana UI that cannot be automatically unprovisioned.
- Download the latest dashboard bundle.
- Replace the datasource UID placeholder in the dashboard files with the Athena datasource UID from your installation as described in prerequisites.
- Copy the contents of the dashboards folder from the provided zip file into a directory on your Grafana server.
- Restart your Grafana instance to apply the changes.
Upgrade dashboards provisioned using Grafana’s provisioning mechanism with Helm and Kubernetes
If you have followed the procedure of importing the dashboards via Grafana provisioning with Helm and Kubernetes, you can follow these steps to upgrade the dashboards:
- Delete the ConfigMaps containing the dashboards.
$ kubectl --namespace <YOUR-NAMESPACE> delete configmap -l grafana_dashboard=1
- Open your Grafana and log in with a user account with edit privileges.
- Open the Dashboards view, and click the checkbox next to the Name header to select all folders.
- Click Delete.
- Download the latest dashboard bundle.
- Replace the datasource UID placeholder in the dashboard files with the Athena datasource UID from your installation as described in prerequisites.
- Run the provided kustomize file to re-import the dashboards.
$ kubectl kustomize ./ | kubectl apply -n <YOUR-NAMESPACE> --server-side=true -f -
Upgrading manually imported dashboards
If you have followed the procedure of manually importing the dashboards, you can follow these steps to upgrade the dashboards:
- Open your Grafana and log in with a user account with edit privileges.
- Open the Dashboards view, and click the checkbox next to the Name header to select all folders.
- Click Delete.
- Download the latest dashboard bundle and follow the installation procedure as previously.
Upgrading Amazon Managed Grafana installed with Terraform
If you have followed the procedure of installing Amazon Managed Grafana with Terraform, you can follow these steps to upgrade the dashboards:
- Download the latest dashboard bundle and follow the installation procedure as previously, using the path of the newly downloaded dashboards.
Appendix
Appendix A: S3 object storage enablement
The following configuration block of the values.yaml Helm chart demonstrates the minimal configuration to enable object storage in S3. This cannot be configured in the Develocity Administration UI.
objectStorage:
type: s3
s3:
bucket: develocity
region: «aws-region»
credentials:
source: configuration
accessKey: "«aws-access-key»"
secretKey: "«aws-secret-key»"
Optionally, modify the bucket value to point to the S3 bucket available to Develocity.
See the “Amazon S3” section in the Standalone Helm Chart Configuration Guide or the Kubernetes Helm Chart Configuration Guide for alternative options to configure credentials.
Appendix B: Build model export enablement
The following configuration block of the values.yaml Helm chart demonstrates the minimal configuration to enable the build model export feature without using the Develocity Administration UI.
global:
unattended:
configuration:
version: 9
systemPassword: «hashed-system-password»
advanced:
#configure here when not using a separate bg processor
app:
params:
feature.buildModelExport: ENABLED
#configure here when using a separate bg processor
#appBackgroundProcessor:
# params:
# feature.buildModelExport: ENABLED
See the “Unattended configuration” section in the Develocity Administration Manual for Helm Installations for more information.
Appendix C: Build model schema DDL statements
Use the DDL statement corresponding to your version of Develocity to create the Athena table that describes the schema of the build models. Modify the LOCATION property and the storage.location.template table property to point to the S3 bucket available to Develocity.
Develocity 2024.2.x
CREATE EXTERNAL TABLE `build`(
`type` STRING,
`id` STRING,
`buildToolVersion` STRING,
`buildAgentVersion` STRING,
`modelVersion` STRUCT<`year`: INT, `string`: STRING, `release`: INT, `patch`: INT>,
`buildStartTime` DATE,
`gradleAttributes` STRUCT<`id`: STRING, `hasFailed`: BOOLEAN, `tags`: ARRAY<STRING>, `links`: ARRAY<STRUCT<`label`: STRING, `url`: STRING>>, `environment`: STRUCT<`username`: STRING, `jreVersion`: STRING, `jvmVersion`: STRING, `jvmCharset`: STRING, `jvmLocale`: STRING, `operatingSystem`: STRING, `numberOfCpuCores`: INT, `publicHostname`: STRING, `localHostname`: STRING, `localIpAddresses`: ARRAY<STRING>, `jvmMaxMemoryHeapSize`: BIGINT>, `values`: ARRAY<STRUCT<`name`: STRING, `value`: STRING>>, `rootProjectName`: STRING, `requestedTasks`: ARRAY<STRING>, `develocitySettings`: STRUCT<`backgroundPublicationEnabled`: BOOLEAN, `fileFingerprintCapturingEnabled`: BOOLEAN, `resourceUsageCapturingEnabled`: BOOLEAN, `taskInputsFileCapturingEnabled`: BOOLEAN, `buildOutputCapturingEnabled`: BOOLEAN, `testOutputCapturingEnabled`: BOOLEAN>, `buildOptions`: STRUCT<`buildCacheEnabled`: BOOLEAN, `daemonEnabled`: BOOLEAN, `dryRunEnabled`: BOOLEAN, `excludedTasks`: ARRAY<STRING>, `offlineModeEnabled`: BOOLEAN, `rerunTasksEnabled`: BOOLEAN, `configurationOnDemandEnabled`: BOOLEAN, `parallelProjectExecutionEnabled`: BOOLEAN, `configurationCacheEnabled`: BOOLEAN, `continuousBuildEnabled`: BOOLEAN, `continueOnFailureEnabled`: BOOLEAN, `fileSystemWatchingEnabled`: BOOLEAN, `isolatedProjectsEnabled`: BOOLEAN, `maxNumberOfGradleWorkers`: INT, `refreshDependenciesEnabled`: BOOLEAN>, `buildStartTime`: BIGINT, `buildDuration`: BIGINT, `gradleVersion`: STRING, `pluginVersion`: STRING, `hasVerificationFailure`: BOOLEAN, `hasNonVerificationFailure`: BOOLEAN, `gradleEnterpriseSettings`: STRUCT<`backgroundPublicationEnabled`: BOOLEAN, `resourceUsageCapturingEnabled`: BOOLEAN, `taskInputsFileCapturingEnabled`: BOOLEAN, `buildOutputCapturingEnabled`: BOOLEAN, `testOutputCapturingEnabled`: BOOLEAN>>,
`mavenAttributes` STRUCT<`id`: STRING, `hasFailed`: BOOLEAN, `tags`: ARRAY<STRING>, `links`: ARRAY<STRUCT<`label`: STRING, `url`: STRING>>, `environment`: STRUCT<`username`: STRING, `jreVersion`: STRING, `jvmVersion`: STRING, `jvmCharset`: STRING, `jvmLocale`: STRING, `operatingSystem`: STRING, `numberOfCpuCores`: INT, `publicHostname`: STRING, `localHostname`: STRING, `localIpAddresses`: ARRAY<STRING>, `jvmMaxMemoryHeapSize`: BIGINT>, `values`: ARRAY<STRUCT<`name`: STRING, `value`: STRING>>, `develocitySettings`: STRUCT<`backgroundPublicationEnabled`: BOOLEAN, `fileFingerprintCapturingEnabled`: BOOLEAN, `resourceUsageCapturingEnabled`: BOOLEAN, `goalInputsFileCapturingEnabled`: BOOLEAN, `buildOutputCapturingEnabled`: BOOLEAN, `testOutputCapturingEnabled`: BOOLEAN>, `buildOptions`: STRUCT<`rerunGoals`: BOOLEAN, `offlineModeEnabled`: BOOLEAN, `batchModeEnabled`: BOOLEAN, `debugEnabled`: BOOLEAN, `errorsEnabled`: BOOLEAN, `failAtEndEnabled`: BOOLEAN, `failFastEnabled`: BOOLEAN, `failNeverEnabled`: BOOLEAN, `laxChecksumsEnabled`: BOOLEAN, `maxNumberOfThreads`: INT, `nonRecursiveEnabled`: BOOLEAN, `rerunGoalsEnabled`: BOOLEAN, `quietEnabled`: BOOLEAN, `noSnapshotsUpdatesEnabled`: BOOLEAN, `strictChecksumsEnabled`: BOOLEAN, `updateSnapshotsEnabled`: BOOLEAN>, `buildStartTime`: BIGINT, `buildDuration`: BIGINT, `mavenVersion`: STRING, `extensionVersion`: STRING, `topLevelProjectName`: STRING, `requestedGoals`: ARRAY<STRING>, `hasVerificationFailure`: BOOLEAN, `hasNonVerificationFailure`: BOOLEAN, `gradleEnterpriseSettings`: STRUCT<`backgroundPublicationEnabled`: BOOLEAN, `resourceUsageCapturingEnabled`: BOOLEAN, `goalInputsFileCapturingEnabled`: BOOLEAN, `buildOutputCapturingEnabled`: BOOLEAN, `testOutputCapturingEnabled`: BOOLEAN>>,
`gradleBuildCachePerformance` STRUCT<`id`: STRING, `buildTime`: BIGINT, `buildCaches`: STRUCT<`local`: STRUCT<`isEnabled`: BOOLEAN, `directory`: STRING, `isPushEnabled`: BOOLEAN, `isDisabledDueToError`: BOOLEAN>, `remote`: STRUCT<`type`: STRING, `className`: STRING, `isEnabled`: BOOLEAN, `url`: STRING, `isPushEnabled`: BOOLEAN, `isDisabledDueToError`: BOOLEAN>, `overhead`: STRUCT<`uploading`: BIGINT, `downloading`: BIGINT, `packing`: BIGINT, `unpacking`: BIGINT>>, `serializationFactor`: DOUBLE, `taskExecution`: ARRAY<STRUCT<`duration`: BIGINT, `cacheKey`: STRING, `taskPath`: STRING, `taskType`: STRING, `avoidanceOutcome`: STRING, `avoidanceSavings`: BIGINT, `skipReasonMessage`: STRING, `cacheArtifactSize`: BIGINT, `nonCacheabilityCategory`: STRING, `nonCacheabilityReason`: STRING, `cacheArtifactRejectedReason`: STRING, `fingerprintingDuration`: BIGINT>>, `effectiveWorkUnitExecutionTime`: BIGINT, `workUnitFingerprintingSummary`: STRUCT<`count`: INT, `serialDuration`: BIGINT>, `workUnitAvoidanceSavingsSummary`: STRUCT<`total`: BIGINT, `ratio`: DOUBLE, `upToDate`: BIGINT, `localBuildCache`: BIGINT, `remoteBuildCache`: BIGINT>, `effectiveTaskExecutionTime`: BIGINT, `serialTaskExecutionTime`: BIGINT, `serialWorkUnitExecutionTime`: BIGINT, `taskFingerprintingSummary`: STRUCT<`count`: INT, `serialDuration`: BIGINT>, `avoidanceSavingsSummary`: STRUCT<`total`: BIGINT, `ratio`: DOUBLE, `upToDate`: BIGINT, `localBuildCache`: BIGINT, `remoteBuildCache`: BIGINT>, `taskAvoidanceSavingsSummary`: STRUCT<`total`: BIGINT, `ratio`: DOUBLE, `upToDate`: BIGINT, `localBuildCache`: BIGINT, `remoteBuildCache`: BIGINT>>,
`mavenBuildCachePerformance` STRUCT<`id`: STRING, `buildTime`: BIGINT, `buildCaches`: STRUCT<`local`: STRUCT<`isEnabled`: BOOLEAN, `directory`: STRING, `isPushEnabled`: BOOLEAN, `isDisabledDueToError`: BOOLEAN>, `remote`: STRUCT<`isEnabled`: BOOLEAN, `url`: STRING, `isPushEnabled`: BOOLEAN, `isDisabledDueToError`: BOOLEAN>, `overhead`: STRUCT<`uploading`: BIGINT, `downloading`: BIGINT, `packing`: BIGINT, `unpacking`: BIGINT>>, `serializationFactor`: DOUBLE, `goalExecution`: ARRAY<STRUCT<`duration`: BIGINT, `cacheKey`: STRING, `goalName`: STRING, `mojoType`: STRING, `goalExecutionId`: STRING, `goalProjectName`: STRING, `avoidanceOutcome`: STRING, `avoidanceSavings`: BIGINT, `cacheArtifactSize`: BIGINT, `nonCacheabilityCategory`: STRING, `nonCacheabilityReason`: STRING, `cacheArtifactRejectedReason`: STRING, `fingerprintingDuration`: BIGINT>>, `effectiveProjectExecutionTime`: BIGINT, `avoidanceSavingsSummary`: STRUCT<`total`: BIGINT, `ratio`: DOUBLE, `localBuildCache`: BIGINT, `remoteBuildCache`: BIGINT>, `serialProjectExecutionTime`: BIGINT, `goalFingerprintingSummary`: STRUCT<`count`: INT, `serialDuration`: BIGINT>>,
`gradleBuildProfileOverview` STRUCT<`memoryUsage`: STRUCT<`memoryPools`: ARRAY<STRUCT<`name`: STRING, `peakMemory`: BIGINT, `maxMemory`: BIGINT>>, `totalGarbageCollectionTime`: BIGINT>, `breakdown`: STRUCT<`execution`: BIGINT, `total`: BIGINT, `endOfBuild`: BIGINT, `initialization`: BIGINT, `configuration`: BIGINT>>,
`mavenBuildProfileOverview` STRUCT<`memoryUsage`: STRUCT<`memoryPools`: ARRAY<STRUCT<`name`: STRING, `peakMemory`: BIGINT, `maxMemory`: BIGINT>>, `totalGarbageCollectionTime`: BIGINT>, `breakdown`: STRUCT<`execution`: STRUCT<`total`: BIGINT, `endOfBuild`: BIGINT, `goalExecution`: BIGINT>, `total`: BIGINT, `initializationAndDiscovery`: STRUCT<`settings`: BIGINT, `toolchains`: BIGINT, `other`: BIGINT, `total`: BIGINT, `projectDiscovery`: BIGINT>>>,
`gradleProjects` ARRAY<STRUCT<`name`: STRING, `parent`: INT, `path`: STRING>>,
`mavenModules` ARRAY<STRUCT<`name`: STRING, `parent`: INT, `version`: STRING, `groupId`: STRING, `artifactId`: STRING>>,
`mavenDependencyResolution` STRUCT<`networkRequestCount`: BIGINT, `fileDownloadSize`: BIGINT, `fileDownloadCount`: BIGINT, `serialDependencyResolutionTime`: BIGINT, `serialNetworkRequestTime`: BIGINT, `wallClockNetworkRequestTime`: BIGINT>,
`gradleNetworkActivity` STRUCT<`networkRequestCount`: BIGINT, `fileDownloadSize`: BIGINT, `fileDownloadCount`: BIGINT, `serialNetworkRequestTime`: BIGINT, `wallClockNetworkRequestTime`: BIGINT>,
`gradleArtifactTransformExecutions` STRUCT<`artifactTransformExecutions`: ARRAY<STRUCT<`duration`: BIGINT, `cacheKey`: STRING, `outcome`: STRING, `transformActionType`: STRING, `inputArtifactName`: STRING, `changedAttributes`: ARRAY<STRUCT<`name`: STRING, `from`: STRING, `to`: STRING>>, `avoidanceOutcome`: STRING, `avoidanceSavings`: BIGINT, `skipReasonMessage`: STRING, `cacheArtifactSize`: BIGINT, `artifactTransformExecutionName`: STRING, `nonCacheabilityCategory`: STRING, `nonCacheabilityReason`: STRING, `cacheArtifactRejectedReason`: STRING, `fingerprintingDuration`: BIGINT>>>,
`gradleDeprecations` STRUCT<`deprecations`: ARRAY<STRUCT<`summary`: STRING, `advice`: STRING, `usages`: ARRAY<STRUCT<`owner`: STRUCT<`location`: STRING, `type`: STRING>, `contextualAdvice`: STRING>>, `removalDetails`: STRING, `documentationUrl`: STRING>>>,
`gradlePlugins` STRUCT<`plugins`: ARRAY<STRUCT<`id`: STRING, `className`: STRING, `version`: STRING, `projects`: ARRAY<STRING>>>>,
`mavenPlugins` STRUCT<`plugins`: ARRAY<STRUCT<`name`: STRING, `version`: STRING, `modules`: ARRAY<STRING>, `goalPrefix`: STRING, `groupId`: STRING, `artifactId`: STRING, `executedGoals`: ARRAY<STRING>, `requiredMavenVersion`: STRING>>>,
`gradleResourceUsage` STRUCT<`totalMemory`: BIGINT, `execution`: STRUCT<`networkUploadThroughput`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `allProcessesCpu`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `buildProcessCpu`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `allProcessesMemory`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `buildProcessMemory`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `diskReadThroughput`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `diskWriteThroughput`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `buildChildProcessesCpu`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `buildChildProcessesMemory`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `networkDownloadThroughput`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>>, `total`: STRUCT<`networkUploadThroughput`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `allProcessesCpu`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `buildProcessCpu`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `allProcessesMemory`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `buildProcessMemory`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `diskReadThroughput`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `diskWriteThroughput`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `buildChildProcessesCpu`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `buildChildProcessesMemory`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `networkDownloadThroughput`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>>, `nonExecution`: STRUCT<`networkUploadThroughput`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `allProcessesCpu`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `buildProcessCpu`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `allProcessesMemory`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `buildProcessMemory`: 
STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `diskReadThroughput`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `diskWriteThroughput`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `buildChildProcessesCpu`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `buildChildProcessesMemory`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `networkDownloadThroughput`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>>>,
`mavenResourceUsage` STRUCT<`totalMemory`: BIGINT, `execution`: STRUCT<`networkUploadThroughput`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `allProcessesCpu`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `buildProcessCpu`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `allProcessesMemory`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `buildProcessMemory`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `diskReadThroughput`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `diskWriteThroughput`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `buildChildProcessesCpu`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `buildChildProcessesMemory`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `networkDownloadThroughput`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>>, `total`: STRUCT<`networkUploadThroughput`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `allProcessesCpu`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `buildProcessCpu`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `allProcessesMemory`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `buildProcessMemory`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `diskReadThroughput`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `diskWriteThroughput`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `buildChildProcessesCpu`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `buildChildProcessesMemory`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `networkDownloadThroughput`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>>, `nonExecution`: STRUCT<`networkUploadThroughput`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `allProcessesCpu`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `buildProcessCpu`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `allProcessesMemory`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `buildProcessMemory`: 
STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `diskReadThroughput`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `diskWriteThroughput`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `buildChildProcessesCpu`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `buildChildProcessesMemory`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>, `networkDownloadThroughput`: STRUCT<`average`: BIGINT, `max`: BIGINT, `median`: BIGINT, `p5`: BIGINT, `p25`: BIGINT, `p75`: BIGINT, `p95`: BIGINT>>>,
`gradleConfigurationCache` STRUCT<`result`: STRUCT<`outcome`: STRING, `entrySize`: BIGINT, `store`: STRUCT<`duration`: BIGINT, `hasFailed`: BOOLEAN>, `load`: STRUCT<`duration`: BIGINT, `hasFailed`: BOOLEAN>, `missReasons`: ARRAY<STRING>>>
)
PARTITIONED BY (
`startdate` date COMMENT ''
)
ROW FORMAT SERDE
'org.openx.data.jsonserde.JsonSerDe'
WITH SERDEPROPERTIES (
'ignore.malformed.json'='true'
)
STORED AS INPUTFORMAT
'org.apache.hadoop.mapred.TextInputFormat'
OUTPUTFORMAT
'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
LOCATION
'«BUCKET»/build-models'
TBLPROPERTIES (
'projection.enabled'='true',
'projection.startdate.format'='yyyy-MM-dd',
'projection.startdate.range'='2020-01-01,NOW',
'projection.startdate.type'='date',
'storage.location.template'='«BUCKET»/build-models/startdate=${startdate}/'
)
Develocity 2024.1.x
CREATE EXTERNAL TABLE `build`(
`type` STRING,
`id` STRING,
`buildToolVersion` STRING,
`buildAgentVersion` STRING,
`modelVersion` STRUCT<`year`: INT, `string`: STRING, `release`: INT, `patch`: INT>,
`buildStartTime` DATE,
`gradleAttributes` STRUCT<`environment`: STRUCT<`jvmMaxMemoryHeapSize`: BIGINT, `localHostname`: STRING, `operatingSystem`: STRING, `numberOfCpuCores`: INT, `publicHostname`: STRING, `localIpAddresses`: ARRAY<STRING>, `username`: STRING, `jreVersion`: STRING, `jvmVersion`: STRING, `jvmCharset`: STRING, `jvmLocale`: STRING>, `id`: STRING, `hasVerificationFailure`: BOOLEAN, `hasNonVerificationFailure`: BOOLEAN, `gradleEnterpriseSettings`: STRUCT<`backgroundPublicationEnabled`: BOOLEAN, `taskInputsFileCapturingEnabled`: BOOLEAN, `buildOutputCapturingEnabled`: BOOLEAN, `testOutputCapturingEnabled`: BOOLEAN>, `buildStartTime`: BIGINT, `buildDuration`: BIGINT, `gradleVersion`: STRING, `pluginVersion`: STRING, `rootProjectName`: STRING, `requestedTasks`: ARRAY<STRING>, `develocitySettings`: STRUCT<`backgroundPublicationEnabled`: BOOLEAN, `taskInputsFileCapturingEnabled`: BOOLEAN, `fileFingerprintCapturingEnabled`: BOOLEAN, `buildOutputCapturingEnabled`: BOOLEAN, `testOutputCapturingEnabled`: BOOLEAN>, `buildOptions`: STRUCT<`configurationOnDemandEnabled`: BOOLEAN, `parallelProjectExecutionEnabled`: BOOLEAN, `configurationCacheEnabled`: BOOLEAN, `continuousBuildEnabled`: BOOLEAN, `continueOnFailureEnabled`: BOOLEAN, `fileSystemWatchingEnabled`: BOOLEAN, `maxNumberOfGradleWorkers`: INT, `refreshDependenciesEnabled`: BOOLEAN, `buildCacheEnabled`: BOOLEAN, `daemonEnabled`: BOOLEAN, `dryRunEnabled`: BOOLEAN, `excludedTasks`: ARRAY<STRING>, `offlineModeEnabled`: BOOLEAN, `rerunTasksEnabled`: BOOLEAN>, `hasFailed`: BOOLEAN, `tags`: ARRAY<STRING>, `values`: ARRAY<STRUCT<`name`: STRING, `value`: STRING>>, `links`: ARRAY<STRUCT<`label`: STRING, `URL`: STRING>>>,
`mavenAttributes` STRUCT<`environment`: STRUCT<`jvmMaxMemoryHeapSize`: BIGINT, `localHostname`: STRING, `operatingSystem`: STRING, `numberOfCpuCores`: INT, `publicHostname`: STRING, `localIpAddresses`: ARRAY<STRING>, `username`: STRING, `jreVersion`: STRING, `jvmVersion`: STRING, `jvmCharset`: STRING, `jvmLocale`: STRING>, `id`: STRING, `hasVerificationFailure`: BOOLEAN, `hasNonVerificationFailure`: BOOLEAN, `gradleEnterpriseSettings`: STRUCT<`backgroundPublicationEnabled`: BOOLEAN, `goalInputsFileCapturingEnabled`: BOOLEAN, `buildOutputCapturingEnabled`: BOOLEAN, `testOutputCapturingEnabled`: BOOLEAN>, `mavenVersion`: STRING, `extensionVersion`: STRING, `topLevelProjectName`: STRING, `requestedGoals`: ARRAY<STRING>, `buildStartTime`: BIGINT, `buildDuration`: BIGINT, `develocitySettings`: STRUCT<`backgroundPublicationEnabled`: BOOLEAN, `fileFingerprintCapturingEnabled`: BOOLEAN, `goalInputsFileCapturingEnabled`: BOOLEAN, `buildOutputCapturingEnabled`: BOOLEAN, `testOutputCapturingEnabled`: BOOLEAN>, `buildOptions`: STRUCT<`noSnapshotsUpdatesEnabled`: BOOLEAN, `strictChecksumsEnabled`: BOOLEAN, `updateSnapshotsEnabled`: BOOLEAN, `batchModeEnabled`: BOOLEAN, `debugEnabled`: BOOLEAN, `errorsEnabled`: BOOLEAN, `failAtEndEnabled`: BOOLEAN, `failFastEnabled`: BOOLEAN, `failNeverEnabled`: BOOLEAN, `laxChecksumsEnabled`: BOOLEAN, `maxNumberOfThreads`: INT, `nonRecursiveEnabled`: BOOLEAN, `rerunGoalsEnabled`: BOOLEAN, `quietEnabled`: BOOLEAN, `offlineModeEnabled`: BOOLEAN, `rerunGoals`: BOOLEAN>, `hasFailed`: BOOLEAN, `tags`: ARRAY<STRING>, `values`: ARRAY<STRUCT<`name`: STRING, `value`: STRING>>, `links`: ARRAY<STRUCT<`label`: STRING, `URL`: STRING>>>,
`gradleBuildCachePerformance` STRUCT<`id`: STRING, `effectiveWorkUnitExecutionTime`: BIGINT, `workUnitFingerprintingSummary`: STRUCT<`count`: INT, `serialDuration`: BIGINT>, `workUnitAvoidanceSavingsSummary`: STRUCT<`localBuildCache`: BIGINT, `remoteBuildCache`: BIGINT, `total`: BIGINT, `ratio`: DOUBLE, `upToDate`: BIGINT>, `effectiveTaskExecutionTime`: BIGINT, `serialTaskExecutionTime`: BIGINT, `serialWorkUnitExecutionTime`: BIGINT, `taskFingerprintingSummary`: STRUCT<`count`: INT, `serialDuration`: BIGINT>, `avoidanceSavingsSummary`: STRUCT<`localBuildCache`: BIGINT, `remoteBuildCache`: BIGINT, `total`: BIGINT, `ratio`: DOUBLE, `upToDate`: BIGINT>, `taskAvoidanceSavingsSummary`: STRUCT<`localBuildCache`: BIGINT, `remoteBuildCache`: BIGINT, `total`: BIGINT, `ratio`: DOUBLE, `upToDate`: BIGINT>, `serializationFactor`: DOUBLE, `taskExecution`: ARRAY<STRUCT<`duration`: BIGINT, `cacheKey`: STRING, `nonCacheabilityCategory`: STRING, `nonCacheabilityReason`: STRING, `cacheArtifactRejectedReason`: STRING, `fingerprintingDuration`: BIGINT, `avoidanceOutcome`: STRING, `avoidanceSavings`: BIGINT, `skipReasonMessage`: STRING, `cacheArtifactSize`: BIGINT, `taskPath`: STRING, `taskType`: STRING>>, `buildTime`: BIGINT, `buildCaches`: STRUCT<`local`: STRUCT<`isDisabledDueToError`: BOOLEAN, `isPushEnabled`: BOOLEAN, `isEnabled`: BOOLEAN, `directory`: STRING>, `remote`: STRUCT<`type`: STRING, `className`: STRING, `isDisabledDueToError`: BOOLEAN, `isPushEnabled`: BOOLEAN, `isEnabled`: BOOLEAN, `URL`: STRING>, `overhead`: STRUCT<`uploading`: BIGINT, `downloading`: BIGINT, `packing`: BIGINT, `unpacking`: BIGINT>>>,
`mavenBuildCachePerformance` STRUCT<`id`: STRING, `effectiveProjectExecutionTime`: BIGINT, `serialProjectExecutionTime`: BIGINT, `goalFingerprintingSummary`: STRUCT<`count`: INT, `serialDuration`: BIGINT>, `avoidanceSavingsSummary`: STRUCT<`localBuildCache`: BIGINT, `remoteBuildCache`: BIGINT, `total`: BIGINT, `ratio`: DOUBLE>, `goalExecution`: ARRAY<STRUCT<`duration`: BIGINT, `cacheKey`: STRING, `nonCacheabilityCategory`: STRING, `nonCacheabilityReason`: STRING, `cacheArtifactRejectedReason`: STRING, `fingerprintingDuration`: BIGINT, `avoidanceOutcome`: STRING, `avoidanceSavings`: BIGINT, `cacheArtifactSize`: BIGINT, `goalExecutionId`: STRING, `goalProjectName`: STRING, `goalName`: STRING, `mojoType`: STRING>>, `serializationFactor`: DOUBLE, `buildTime`: BIGINT, `buildCaches`: STRUCT<`local`: STRUCT<`isDisabledDueToError`: BOOLEAN, `isPushEnabled`: BOOLEAN, `isEnabled`: BOOLEAN, `directory`: STRING>, `remote`: STRUCT<`isDisabledDueToError`: BOOLEAN, `isPushEnabled`: BOOLEAN, `isEnabled`: BOOLEAN, `URL`: STRING>, `overhead`: STRUCT<`uploading`: BIGINT, `downloading`: BIGINT, `packing`: BIGINT, `unpacking`: BIGINT>>>,
`gradleProjects` ARRAY<STRUCT<`name`: STRING, `parent`: INT, `path`: STRING>>,
`mavenModules` ARRAY<STRUCT<`name`: STRING, `parent`: INT, `version`: STRING, `groupId`: STRING, `artifactId`: STRING>>,
`mavenDependencyResolution` STRUCT<`serialDependencyResolutionTime`: BIGINT, `serialNetworkRequestTime`: BIGINT, `wallClockNetworkRequestTime`: BIGINT, `networkRequestCount`: BIGINT, `fileDownloadSize`: BIGINT, `fileDownloadCount`: BIGINT>,
`gradleNetworkActivity` STRUCT<`serialNetworkRequestTime`: BIGINT, `wallClockNetworkRequestTime`: BIGINT, `networkRequestCount`: BIGINT, `fileDownloadSize`: BIGINT, `fileDownloadCount`: BIGINT>,
`gradleDeprecations` STRUCT<`deprecations`: ARRAY<STRUCT<`removalDetails`: STRING, `documentationUrl`: STRING, `summary`: STRING, `advice`: STRING, `usages`: ARRAY<STRUCT<`owner`: STRUCT<`location`: STRING, `type`: STRING>, `contextualAdvice`: STRING>>>>>,
`gradleArtifactTransformExecutions` STRUCT<`artifactTransformExecutions`: ARRAY<STRUCT<`duration`: BIGINT, `cacheKey`: STRING, `artifactTransformExecutionName`: STRING, `nonCacheabilityCategory`: STRING, `nonCacheabilityReason`: STRING, `cacheArtifactRejectedReason`: STRING, `fingerprintingDuration`: BIGINT, `transformActionType`: STRING, `inputArtifactName`: STRING, `changedAttributes`: ARRAY<STRUCT<`name`: STRING, `from`: STRING, `to`: STRING>>, `avoidanceOutcome`: STRING, `avoidanceSavings`: BIGINT, `skipReasonMessage`: STRING, `cacheArtifactSize`: BIGINT, `outcome`: STRING>>>
)
PARTITIONED BY(
`startdate` date
)
ROW FORMAT SERDE
'org.apache.hive.hcatalog.data.JsonSerDe'
WITH SERDEPROPERTIES(
'ignore.malformed.json'='true'
)
STORED AS INPUTFORMAT
'org.apache.hadoop.mapred.TextInputFormat'
OUTPUTFORMAT
'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
LOCATION
'«BUCKET»/build-models'
TBLPROPERTIES(
'projection.enabled'='true',
'projection.startdate.format'='yyyy-MM-dd',
'projection.startdate.range'='2020-01-01,NOW',
'projection.startdate.type'='date',
'storage.location.template'='«BUCKET»/build-models/startdate=${startdate}/'
)
Develocity 2023.4.x
CREATE EXTERNAL TABLE `build`(
`id` string COMMENT 'from deserializer',
`buildstarttime` date COMMENT 'from deserializer',
`buildtoolversion` string COMMENT 'from deserializer',
`buildagentversion` string COMMENT 'from deserializer',
`modelversion` struct<year:int,string:string,patch:int,release:int> COMMENT 'from deserializer',
`gradleattributes` struct<id:string,tags:array<string>,values:array<struct<name:string,value:string>>,links:array<struct<label:string,URL:string>>,buildstarttime:bigint,buildduration:bigint,gradleversion:string,pluginversion:string,rootprojectname:string,requestedtasks:array<string>,hasfailed:boolean,hasverificationfailure:boolean,buildoptions:struct<buildcacheenabled:boolean,continuousbuildenabled:boolean,daemonenabled:boolean,dryrunenabled:boolean,excludedtasks:array<string>,offlinemodeenabled:boolean,reruntasksenabled:boolean,configurationcacheenabled:boolean,configurationondemandenabled:boolean,continueonfailureenabled:boolean,filesystemwatchingenabled:boolean,maxnumberofgradleworkers:int,parallelprojectexecutionenabled:boolean,refreshdependenciesenabled:boolean>,environment:struct<username:string,operatingsystem:string,numberofcpucores:int,jreversion:string,jvmversion:string,jvmmaxmemoryheapsize:bigint,jvmcharset:string,jvmlocale:string,publichostname:string,localhostname:string,localipaddresses:array<string>>,hasnonverificationfailure:boolean,gradleenterprisesettings:struct<backgroundpublicationenabled:boolean,buildoutputcapturingenabled:boolean,taskinputsfilecapturingenabled:boolean,testoutputcapturingenabled:boolean>,develocitysettings:struct<backgroundpublicationenabled:boolean,buildoutputcapturingenabled:boolean,taskinputsfilecapturingenabled:boolean,testoutputcapturingenabled:boolean>> COMMENT 'from deserializer',
`gradlebuildcacheperformance` struct<id:string,effectivetaskexecutiontime:bigint,effectiveworkunitexecutiontime:bigint,serialworkunitexecutiontime:bigint,taskfingerprintingsummary:struct<count:int,serialduration:bigint>,workunitfingerprintingsummary:struct<count:int,serialduration:bigint>,taskavoidancesavingssummary:struct<uptodate:bigint,localbuildcache:bigint,remotebuildcache:bigint,total:bigint,ratio:double>,workunitavoidancesavingssummary:struct<uptodate:bigint,localbuildcache:bigint,remotebuildcache:bigint,total:bigint,ratio:double>,buildtime:bigint,serialtaskexecutiontime:bigint,serializationfactor:double,taskexecution:array<struct<duration:bigint,cachekey:string,cacheartifactrejectedreason:string,fingerprintingduration:bigint,taskpath:string,tasktype:string,avoidanceoutcome:string,avoidancesavings:bigint,noncacheabilitycategory:string,noncacheabilityreason:string,skipreasonmessage:string,cacheartifactsize:bigint>>,avoidancesavingssummary:struct<uptodate:bigint,localbuildcache:bigint,remotebuildcache:bigint,total:bigint,ratio:double>,buildcaches:struct<local:struct<isenabled:boolean,ispushenabled:boolean,isdisabledduetoerror:boolean,directory:string>,overhead:struct<uploading:bigint,downloading:bigint,unpacking:bigint,packing:bigint>,remote:struct<type:string,classname:string,isenabled:boolean,ispushenabled:boolean,isdisabledduetoerror:boolean,URL:string>>> COMMENT 'from deserializer',
`gradleprojects` array<struct<name:string,parent:int,path:string>> COMMENT 'from deserializer',
`mavenattributes` struct<id:string,tags:array<string>,values:array<struct<name:string,value:string>>,links:array<struct<label:string,URL:string>>,buildstarttime:bigint,buildduration:bigint,hasfailed:boolean,hasverificationfailure:boolean,buildoptions:struct<batchmodeenabled:boolean,debugenabled:boolean,errorsenabled:boolean,failatendenabled:boolean,failfastenabled:boolean,failneverenabled:boolean,laxchecksumsenabled:boolean,maxnumberofthreads:int,nonrecursiveenabled:boolean,rerungoals:boolean,quietenabled:boolean,strictchecksumsenabled:boolean,updatesnapshotsenabled:boolean,offlinemodeenabled:boolean,nosnapshotsupdatesenabled:boolean>,environment:struct<username:string,operatingsystem:string,numberofcpucores:int,jreversion:string,jvmversion:string,jvmmaxmemoryheapsize:bigint,jvmcharset:string,jvmlocale:string,publichostname:string,localhostname:string,localipaddresses:array<string>>,mavenversion:string,extensionversion:string,toplevelprojectname:string,requestedgoals:array<string>,hasnonverificationfailure:boolean,gradleenterprisesettings:struct<backgroundpublicationenabled:boolean,buildoutputcapturingenabled:boolean,testoutputcapturingenabled:boolean,goalinputsfilecapturingenabled:boolean>,develocitysettings:struct<backgroundpublicationenabled:boolean,buildoutputcapturingenabled:boolean,testoutputcapturingenabled:boolean,goalinputsfilecapturingenabled:boolean>> COMMENT 'from deserializer',
`mavenbuildcacheperformance` struct<id:string,effectiveprojectexecutiontime:bigint,serialprojectexecutiontime:bigint,goalfingerprintingsummary:struct<count:int,serialduration:bigint>,buildtime:bigint,serializationfactor:double,avoidancesavingssummary:struct<localbuildcache:bigint,remotebuildcache:bigint,total:bigint,ratio:double>,buildcaches:struct<local:struct<isenabled:boolean,ispushenabled:boolean,isdisabledduetoerror:boolean,directory:string>,overhead:struct<uploading:bigint,downloading:bigint,unpacking:bigint,packing:bigint>,remote:struct<isenabled:boolean,ispushenabled:boolean,isdisabledduetoerror:boolean,URL:string>>,goalexecution:array<struct<duration:bigint,cachekey:string,cacheartifactrejectedreason:string,fingerprintingduration:bigint,avoidanceoutcome:string,avoidancesavings:bigint,noncacheabilitycategory:string,noncacheabilityreason:string,cacheartifactsize:bigint,goalname:string,mojotype:string,goalexecutionid:string,goalprojectname:string>>> COMMENT 'from deserializer',
`mavenmodules` array<struct<name:string,parent:int,version:string,groupid:string,artifactid:string>> COMMENT 'from deserializer',
`type` string COMMENT 'from deserializer'
)
PARTITIONED BY (
`startdate` date COMMENT ''
)
ROW FORMAT SERDE
'org.openx.data.jsonserde.JsonSerDe'
WITH SERDEPROPERTIES (
'ignore.malformed.json'='true'
)
STORED AS INPUTFORMAT
'org.apache.hadoop.mapred.TextInputFormat'
OUTPUTFORMAT
'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
LOCATION
'«BUCKET»/build-models'
TBLPROPERTIES (
'projection.enabled'='true',
'projection.startdate.format'='yyyy-MM-dd',
'projection.startdate.range'='2020-01-01,NOW',
'projection.startdate.type'='date',
  'storage.location.template'='«BUCKET»/build-models/startdate=${startdate}/'
)
Develocity 2023.3.x
CREATE EXTERNAL TABLE `build`(
`id` string COMMENT 'from deserializer',
`buildstarttime` date COMMENT 'from deserializer',
`buildtoolversion` string COMMENT 'from deserializer',
`buildagentversion` string COMMENT 'from deserializer',
`modelversion` struct<year:int,string:string,patch:int,release:int> COMMENT 'from deserializer',
`gradleattributes` struct<id:string,tags:array<string>,values:array<struct<name:string,value:string>>,links:array<struct<label:string,URL:string>>,buildstarttime:bigint,buildduration:bigint,gradleversion:string,pluginversion:string,rootprojectname:string,requestedtasks:array<string>,hasfailed:boolean,hasverificationfailure:boolean,buildoptions:struct<buildcacheenabled:boolean,continuousbuildenabled:boolean,daemonenabled:boolean,dryrunenabled:boolean,excludedtasks:array<string>,offlinemodeenabled:boolean,reruntasksenabled:boolean,configurationcacheenabled:boolean,configurationondemandenabled:boolean,continueonfailureenabled:boolean,filesystemwatchingenabled:boolean,maxnumberofgradleworkers:int,parallelprojectexecutionenabled:boolean,refreshdependenciesenabled:boolean>,environment:struct<username:string,operatingsystem:string,numberofcpucores:int,jreversion:string,jvmversion:string,jvmmaxmemoryheapsize:bigint,jvmcharset:string,jvmlocale:string,publichostname:string,localhostname:string,localipaddresses:array<string>>,hasnonverificationfailure:boolean,gradleenterprisesettings:struct<backgroundpublicationenabled:boolean,buildoutputcapturingenabled:boolean,taskinputsfilecapturingenabled:boolean,testoutputcapturingenabled:boolean>> COMMENT 'from deserializer',
`gradlebuildcacheperformance` struct<id:string,buildtime:bigint,serialtaskexecutiontime:bigint,serializationfactor:double,taskexecution:array<struct<duration:bigint,taskpath:string,tasktype:string,avoidanceoutcome:string,fingerprintingduration:bigint,avoidancesavings:bigint,noncacheabilitycategory:string,noncacheabilityreason:string,skipreasonmessage:string,cacheartifactsize:bigint,cacheartifactrejectedreason:string>>,avoidancesavingssummary:struct<total:bigint,ratio:double,uptodate:bigint,localbuildcache:bigint,remotebuildcache:bigint>,buildcaches:struct<local:struct<isenabled:boolean,ispushenabled:boolean,isdisabledduetoerror:boolean,directory:string>,remote:struct<type:string,classname:string,URL:string,isenabled:boolean,ispushenabled:boolean,isdisabledduetoerror:boolean>,overhead:struct<packing:bigint,uploading:bigint,downloading:bigint,unpacking:bigint>>,effectivetaskexecutiontime:bigint,effectiveworkunitexecutiontime:bigint,serialworkunitexecutiontime:bigint,taskfingerprintingsummary:struct<count:int,serialduration:bigint>> COMMENT 'from deserializer',
`gradleprojects` struct<models:array<struct<name:string,parent:int,path:string>>> COMMENT 'from deserializer',
`mavenattributes` struct<id:string,tags:array<string>,values:array<struct<name:string,value:string>>,links:array<struct<label:string,URL:string>>,buildstarttime:bigint,buildduration:bigint,hasfailed:boolean,hasverificationfailure:boolean,buildoptions:struct<batchmodeenabled:boolean,debugenabled:boolean,errorsenabled:boolean,failatendenabled:boolean,failfastenabled:boolean,failneverenabled:boolean,laxchecksumsenabled:boolean,maxnumberofthreads:int,nonrecursiveenabled:boolean,rerungoals:boolean,quietenabled:boolean,strictchecksumsenabled:boolean,updatesnapshotsenabled:boolean,offlinemodeenabled:boolean,nosnapshotsupdatesenabled:boolean>,environment:struct<username:string,operatingsystem:string,numberofcpucores:int,jreversion:string,jvmversion:string,jvmmaxmemoryheapsize:bigint,jvmcharset:string,jvmlocale:string,publichostname:string,localhostname:string,localipaddresses:array<string>>,mavenversion:string,extensionversion:string,toplevelprojectname:string,requestedgoals:array<string>,hasnonverificationfailure:boolean,gradleenterprisesettings:struct<backgroundpublicationenabled:boolean,buildoutputcapturingenabled:boolean,testoutputcapturingenabled:boolean,goalinputsfilecapturingenabled:boolean>> COMMENT 'from deserializer',
`mavenbuildcacheperformance` struct<id:string,buildtime:bigint,serializationfactor:double,avoidancesavingssummary:struct<total:bigint,ratio:double,localbuildcache:bigint,remotebuildcache:bigint>,buildcaches:struct<local:struct<isenabled:boolean,ispushenabled:boolean,isdisabledduetoerror:boolean,directory:string>,remote:struct<URL:string,isenabled:boolean,ispushenabled:boolean,isdisabledduetoerror:boolean>,overhead:struct<packing:bigint,uploading:bigint,downloading:bigint,unpacking:bigint>>,goalexecution:array<struct<duration:bigint,avoidanceoutcome:string,fingerprintingduration:bigint,avoidancesavings:bigint,noncacheabilitycategory:string,noncacheabilityreason:string,cacheartifactsize:bigint,goalname:string,mojotype:string,goalexecutionid:string,goalprojectname:string,cacheartifactrejectedreason:string>>,effectiveprojectexecutiontime:bigint,serialprojectexecutiontime:bigint,goalfingerprintingsummary:struct<count:int,serialduration:bigint>> COMMENT 'from deserializer',
`mavenmodules` struct<models:array<struct<name:string,parent:int,version:string,groupid:string,artifactid:string>>> COMMENT 'from deserializer',
`type` string COMMENT 'from deserializer'
)
PARTITIONED BY (
`startdate` date COMMENT ''
)
ROW FORMAT SERDE
'org.openx.data.jsonserde.JsonSerDe'
WITH SERDEPROPERTIES (
'ignore.malformed.json'='true'
)
STORED AS INPUTFORMAT
'org.apache.hadoop.mapred.TextInputFormat'
OUTPUTFORMAT
'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
LOCATION
'«BUCKET»/build-models'
TBLPROPERTIES (
'projection.enabled'='true',
'projection.startdate.format'='yyyy-MM-dd',
'projection.startdate.range'='2020-01-01,NOW',
'projection.startdate.type'='date',
'storage.location.template'='«BUCKET»/build-models/startdate=${startdate}/'
)
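All of the table definitions above enable partition projection via the projection.* entries in TBLPROPERTIES, so Athena derives the startdate partitions automatically and no MSCK REPAIR TABLE or manual partition registration is required. Always filter on startdate in your queries to limit the amount of S3 data Athena scans. As an illustrative sketch only (assuming the build table and the type column defined above, in the develocity-db database used throughout this guide), a query like the following counts the builds exported over the last seven days per build tool:

-- Count builds per build tool type, scanning only the last 7 days of partitions.
-- Assumes the `build` table created above in the `develocity-db` database.
SELECT type,
       count(*) AS builds
FROM build
WHERE startdate >= current_date - INTERVAL '7' DAY
GROUP BY type
ORDER BY builds DESC;

If this returns rows once build models have been exported, the table definition matches the exported data and you can proceed to the verification and Grafana steps.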