Commit 78599d7

fix broken links
1 parent de4d4a3 commit 78599d7

7 files changed: +351 −17 lines


docs/chdb/guides/index.md

Lines changed: 9 additions & 0 deletions
@@ -14,4 +14,13 @@ in the table of contents, please edit the frontmatter of the files directly.
 -->
 
 <!--AUTOGENERATED_START-->
+| Page | Description |
+|-----|-----|
+| [How to query a remote ClickHouse server](/chdb/guides/query-remote-clickhouse) | In this guide, we will learn how to query a remote ClickHouse server from chDB. |
+| [How to query Apache Arrow with chDB](/chdb/guides/apache-arrow) | In this guide, we will learn how to query Apache Arrow tables with chDB |
+| [How to query data in an S3 bucket](/chdb/guides/querying-s3) | Learn how to query data in an S3 bucket with chDB. |
+| [How to query Pandas DataFrames with chDB](/chdb/guides/pandas) | Learn how to query Pandas DataFrames with chDB |
+| [How to query Parquet files](/chdb/guides/querying-parquet) | Learn how to query Parquet files with chDB. |
+| [JupySQL and chDB](/chdb/guides/jupysql) | How to install chDB for Bun |
+| [Using a clickhouse-local database](/chdb/guides/clickhouse-local) | Learn how to use a clickhouse-local database with chDB |
 <!--AUTOGENERATED_END-->

docs/cloud/onboard/02_migrate/01_migration_guides/01_overview.md

Lines changed: 1 addition & 1 deletion
@@ -30,4 +30,4 @@ There are several options for migrating data into ClickHouse Cloud, depending on
 - [Anywhere!](/cloud/migration/etl-tool-to-clickhouse): use one of the many popular ETL/ELT tools that connect to all kinds of different data sources
 - [Object Storage](/integrations/migration/object-storage-to-clickhouse): easily insert data from S3 into ClickHouse
 
-In the example [Migrate from Redshift](/integrations/data-ingestion/redshift/index.md), we present three different ways to migrate data to ClickHouse.
+In the example [Migrate from Redshift](/migrations/redshift/migration-guide), we present three different ways to migrate data to ClickHouse.

docs/faq/integration/index.md

Lines changed: 1 addition & 1 deletion
@@ -13,7 +13,7 @@ description: 'Landing page listing questions related to integrating ClickHouse w
 - [How to import JSON into ClickHouse?](/integrations/data-ingestion/data-formats/json/intro.md)
 - [How do I connect Kafka to ClickHouse?](/integrations/data-ingestion/kafka/index.md)
 - [Can I connect my Java application to ClickHouse?](/integrations/data-ingestion/dbms/jdbc-with-clickhouse.md)
-- [Can ClickHouse read tables from MySQL?](/integrations/data-ingestion/dbms/mysql/index.md)
+- [Can ClickHouse read tables from MySQL?](/integrations/mysql)
 - [Can ClickHouse read tables from PostgreSQL](/integrations/data-ingestion/dbms/postgresql/connecting-to-postgresql.md)
 - [What if I have a problem with encodings when connecting to Oracle via ODBC?](/faq/integration/oracle-odbc.md)
 

docs/getting-started/index.md

Lines changed: 3 additions & 0 deletions
@@ -37,9 +37,12 @@ by https://github.com/ClickHouse/clickhouse-docs/blob/main/scripts/autogenerate-
 | [Anonymized Web Analytics](/getting-started/example-datasets/metrica) | Dataset consisting of two tables containing anonymized web analytics data with hits and visits |
 | [Brown University Benchmark](/getting-started/example-datasets/brown-benchmark) | A new analytical benchmark for machine-generated log data |
 | [COVID-19 Open-Data](/getting-started/example-datasets/covid19) | COVID-19 Open-Data is a large, open-source database of COVID-19 epidemiological data and related factors like demographics, economics, and government responses |
+| [dbpedia dataset](/getting-started/example-datasets/dbpedia-dataset) | Dataset containing 1 million articles from Wikipedia and their vector embeddings |
 | [Environmental Sensors Data](/getting-started/example-datasets/environmental-sensors) | Over 20 billion records of data from Sensor.Community, a contributors-driven global sensor network that creates Open Environmental Data. |
 | [Foursquare places](/getting-started/example-datasets/foursquare-places) | Dataset with over 100 million records containing information about places on a map, such as shops, restaurants, parks, playgrounds, and monuments. |
 | [GitHub Events Dataset](/getting-started/example-datasets/github-events) | Dataset containing all events on GitHub from 2011 to Dec 6 2020, with a size of 3.1 billion records. |
+| [Hacker News dataset](/getting-started/example-datasets/hacker-news) | Dataset containing 28 million rows of hacker news data. |
+| [LAION 5B dataset](/getting-started/example-datasets/laion-5b-dataset) | Dataset containing 100 million vectors from the LAION 5B dataset |
 | [Laion-400M dataset](/getting-started/example-datasets/laion-400m-dataset) | Dataset containing 400 million images with English image captions |
 | [New York Public Library "What's on the Menu?" Dataset](/getting-started/example-datasets/menus) | Dataset containing 1.3 million records of historical data on the menus of hotels, restaurants and cafes with the dishes along with their prices. |
 | [NYPD Complaint Data](/getting-started/example-datasets/nypd_complaint_data) | Ingest and query Tab Separated Value data in 5 steps |

docs/integrations/data-ingestion/clickpipes/kafka/index.md

Lines changed: 2 additions & 2 deletions
@@ -9,8 +9,8 @@ title: 'Kafka ClickPipes'
 | Page | Description |
 |-----|-----|
 | [Reference](/integrations/clickpipes/kafka/reference) | Details supported formats, sources, delivery semantics, authentication and experimental features supported by Kafka ClickPipes |
-| [Schema registries for Kafka ClickPipe](/integrations/clickpipes/kafka/schema-registries) | Information on schema registries for Kafka ClickPipe |
+| [Schema registries for Kafka ClickPipe](/integrations/clickpipes/kafka/schema-registries) | How to integrate for ClickPipes with a schema registry for schema management |
 | [Creating your first Kafka ClickPipe](/integrations/clickpipes/kafka/create-your-first-kafka-clickpipe) | Step-by-step guide to creating your first Kafka ClickPipe. |
-| [Kafka ClickPipes FAQ](/integrations/clickpipes/kafka/faq) | Frequently asked questions about Kafka ClickPipes |
+| [Kafka ClickPipes FAQ](/integrations/clickpipes/kafka/faq) | Frequently asked questions about ClickPipes for Kafka |
 | [Best practices](/integrations/clickpipes/kafka/best-practices) | Details best practices to follow when working with Kafka ClickPipes |
 <!--AUTOGENERATED_END-->
