
Commit d2e8722

Merge branch 'main' into Blargian-patch-969026
2 parents: 70f012c + badc94a

File tree: 62 files changed, +1298 additions, −496 deletions


docs/about-us/support.md

Lines changed: 4 additions & 4 deletions
@@ -9,15 +9,15 @@ doc_type: 'reference'

# ClickHouse Cloud support services

-ClickHouse provides Support Services for our ClickHouse Cloud users and customers. Our objective is a Support Services team that represents the ClickHouse product – unparalleled performance, ease of use, and exceptionally fast, high-quality results. For details, [visit our ClickHouse Support Program](https://clickhouse.com/support/program/) page.
+ClickHouse provides support services for our ClickHouse Cloud users and customers. Our objective is a support services team that represents the ClickHouse product – unparalleled performance, ease of use, and exceptionally fast, high-quality results. For details, [visit our ClickHouse Support Program](https://clickhouse.com/support/program/) page.

[Login to the Cloud console](https://console.clickhouse.cloud/support) and select **Help -> Support** from the menu options to open a new support case and view the status of your submitted cases.

You can also subscribe to our [status page](https://status.clickhouse.com) to get notified quickly about any incidents affecting our platform.

:::note
-Please note that only Subscription Customers have a Service Level Agreement on Support Incidents. If you are not currently a ClickHouse Cloud user – while we will try to answer your question, we'd encourage you to go instead to one of our Community resources:
+Please note that only subscription customers have a service level agreement on support incidents. If you are not currently a ClickHouse Cloud user – while we will try to answer your question, we'd encourage you to go instead to one of our community resources:

-- [ClickHouse Community Slack Channel](https://clickhouse.com/slack)
-- [Other Community Options](https://github.com/ClickHouse/ClickHouse/blob/master/README.md#useful-links)
+- [ClickHouse community Slack channel](https://clickhouse.com/slack)
+- [Other community options](https://github.com/ClickHouse/ClickHouse/blob/master/README.md#useful-links)
:::

docs/best-practices/select_data_type.md

Lines changed: 2 additions & 2 deletions
@@ -1,8 +1,8 @@
---
slug: /best-practices/select-data-types
sidebar_position: 10
-sidebar_label: 'Selecting data Types'
-title: 'Selecting data Types'
+sidebar_label: 'Selecting data types'
+title: 'Selecting data types'
description: 'Page describing how to choose data types in ClickHouse'
keywords: ['data types']
doc_type: 'reference'

docs/integrations/data-ingestion/apache-spark/spark-jdbc.md

Lines changed: 4 additions & 0 deletions
@@ -11,8 +11,12 @@ doc_type: 'guide'
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
import TOCInline from '@theme/TOCInline';
+import ClickHouseSupportedBadge from '@theme/badges/ClickHouseSupported';

# Spark JDBC
+
+<ClickHouseSupportedBadge/>
+
JDBC is one of the most commonly used data sources in Spark.
In this section, we will provide details on how to
use the [ClickHouse official JDBC connector](/integrations/language-clients/java/jdbc) with Spark.
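For context on the page this hunk touches: a minimal PySpark sketch of reading a ClickHouse table over JDBC might look like the following. It is not part of this commit; the host, credentials, and table name are placeholders, and it assumes the ClickHouse JDBC driver JAR is already on the Spark classpath.

```python
# Minimal sketch: read a ClickHouse table into a Spark DataFrame over JDBC.
# Host, database, table, and credentials below are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("clickhouse-jdbc-example").getOrCreate()

df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:clickhouse://my-clickhouse-host:8123/default")  # placeholder host
    .option("driver", "com.clickhouse.jdbc.ClickHouseDriver")            # official JDBC driver class
    .option("dbtable", "my_table")                                       # placeholder table
    .option("user", "default")
    .option("password", "")
    .load()
)

df.show(5)
```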

docs/integrations/data-ingestion/aws-glue/index.md

Lines changed: 3 additions & 0 deletions
@@ -13,9 +13,12 @@ import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
import notebook_connections_config from '@site/static/images/integrations/data-ingestion/aws-glue/notebook-connections-config.png';
import dependent_jars_path_option from '@site/static/images/integrations/data-ingestion/aws-glue/dependent_jars_path_option.png';
+import ClickHouseSupportedBadge from '@theme/badges/ClickHouseSupported';

# Integrating Amazon Glue with ClickHouse and Spark

+<ClickHouseSupportedBadge/>
+
[Amazon Glue](https://aws.amazon.com/glue/) is a fully managed, serverless data integration service provided by Amazon Web Services (AWS). It simplifies the process of discovering, preparing, and transforming data for analytics, machine learning, and application development.

## Installation {#installation}

docs/integrations/data-ingestion/azure-data-factory/overview.md

Lines changed: 4 additions & 0 deletions
@@ -7,8 +7,12 @@ title: 'Bringing Azure Data into ClickHouse'
doc_type: 'guide'
---

+import ClickHouseSupportedBadge from '@theme/badges/ClickHouseSupported';
+
# Bringing Azure Data into ClickHouse

+<ClickHouseSupportedBadge/>
+
Microsoft Azure offers a wide range of tools to store, transform, and analyze
data. However, in many scenarios, ClickHouse can provide significantly better
performance for low-latency querying and processing of huge datasets. In

docs/integrations/data-ingestion/azure-data-factory/using_azureblobstorage.md

Lines changed: 0 additions & 1 deletion
@@ -8,7 +8,6 @@ doc_type: 'guide'
---

import Image from '@theme/IdealImage';
-
import azureDataStoreSettings from '@site/static/images/integrations/data-ingestion/azure-data-factory/azure-data-store-settings.png';
import azureDataStoreAccessKeys from '@site/static/images/integrations/data-ingestion/azure-data-factory/azure-data-store-access-keys.png';

docs/integrations/data-ingestion/azure-synapse/index.md

Lines changed: 3 additions & 0 deletions
@@ -11,9 +11,12 @@ import TOCInline from '@theme/TOCInline';
import Image from '@theme/IdealImage';
import sparkConfigViaNotebook from '@site/static/images/integrations/data-ingestion/azure-synapse/spark_notebook_conf.png';
import sparkUICHSettings from '@site/static/images/integrations/data-ingestion/azure-synapse/spark_ui_ch_settings.png';
+import ClickHouseSupportedBadge from '@theme/badges/ClickHouseSupported';

# Integrating Azure Synapse with ClickHouse

+<ClickHouseSupportedBadge/>
+
[Azure Synapse](https://azure.microsoft.com/en-us/products/synapse-analytics) is an integrated analytics service that combines big data, data science and warehousing to enable fast, large-scale data analysis.
Within Synapse, Spark pools provide on-demand, scalable [Apache Spark](https://spark.apache.org) clusters that let users run complex data transformations, machine learning, and integrations with external systems.

docs/integrations/data-ingestion/etl-tools/airbyte-and-clickhouse.md

Lines changed: 2 additions & 2 deletions
@@ -18,11 +18,11 @@ import airbyte06 from '@site/static/images/integrations/data-ingestion/etl-tools
import airbyte07 from '@site/static/images/integrations/data-ingestion/etl-tools/airbyte_07.png';
import airbyte08 from '@site/static/images/integrations/data-ingestion/etl-tools/airbyte_08.png';
import airbyte09 from '@site/static/images/integrations/data-ingestion/etl-tools/airbyte_09.png';
-import CommunityMaintainedBadge from '@theme/badges/CommunityMaintained';
+import PartnerBadge from '@theme/badges/PartnerBadge';

# Connect Airbyte to ClickHouse

-<CommunityMaintainedBadge/>
+<PartnerBadge/>

:::note
Please note that the Airbyte source and destination for ClickHouse are currently in Alpha status and not suitable for moving large datasets (> 10 million rows)

docs/integrations/data-ingestion/etl-tools/bladepipe-and-clickhouse.md

Lines changed: 2 additions & 2 deletions
@@ -18,11 +18,11 @@ import bp_ck_6 from '@site/static/images/integrations/data-ingestion/etl-tools/b
import bp_ck_7 from '@site/static/images/integrations/data-ingestion/etl-tools/bp_ck_7.png';
import bp_ck_8 from '@site/static/images/integrations/data-ingestion/etl-tools/bp_ck_8.png';
import bp_ck_9 from '@site/static/images/integrations/data-ingestion/etl-tools/bp_ck_9.png';
-import CommunityMaintainedBadge from '@theme/badges/CommunityMaintained';
+import PartnerBadge from '@theme/badges/PartnerBadge';

# Connect BladePipe to ClickHouse

-<CommunityMaintainedBadge/>
+<PartnerBadge/>

<a href="https://www.bladepipe.com/" target="_blank">BladePipe</a> is a real-time end-to-end data integration tool with sub-second latency, boosting seamless data flow across platforms.

docs/integrations/data-ingestion/etl-tools/dlt-and-clickhouse.md

Lines changed: 2 additions & 2 deletions
@@ -7,11 +7,11 @@ slug: /integrations/data-ingestion/etl-tools/dlt-and-clickhouse
doc_type: 'guide'
---

-import CommunityMaintainedBadge from '@theme/badges/CommunityMaintained';
+import PartnerBadge from '@theme/badges/PartnerBadge';

# Connect dlt to ClickHouse

-<CommunityMaintainedBadge/>
+<PartnerBadge/>

<a href="https://dlthub.com/docs/intro" target="_blank">dlt</a> is an open-source library that you can add to your Python scripts to load data from various and often messy data sources into well-structured, live datasets.
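For context on the page this hunk touches: a minimal sketch of what loading data into ClickHouse with dlt can look like is shown below. It is not part of this commit; the pipeline, dataset, and table names are placeholders, and it assumes dlt is installed with its ClickHouse destination and that connection credentials are configured via dlt's secrets/config.

```python
# Minimal sketch: load a small Python iterable into ClickHouse using dlt.
# Pipeline, dataset, and table names are placeholders.
import dlt

data = [
    {"id": 1, "name": "alice"},
    {"id": 2, "name": "bob"},
]

pipeline = dlt.pipeline(
    pipeline_name="example_pipeline",   # placeholder name
    destination="clickhouse",           # dlt's ClickHouse destination
    dataset_name="example_dataset",     # placeholder dataset
)

load_info = pipeline.run(data, table_name="users")  # placeholder table
print(load_info)
```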
