Commit c3c1356

Extend Hugging Face support and fix/update examples

1 parent d15d098 · commit c3c1356

18 files changed: +412 −67 lines


README.md

Lines changed: 0 additions & 8 deletions

@@ -34,11 +34,3 @@ Help Symfony by [sponsoring](https://symfony.com/sponsor) its development!
 ## Contributing
 
 Thank you for considering contributing to Symfony AI! You can find the [contribution guide here](CONTRIBUTING.md).
-
-## Fixture Licenses
-
-For testing multi-modal features, the repository contains binary media content, with the following owners and licenses:
-
-* `tests/Fixture/image.jpg`: Chris F., Creative Commons, see [pexels.com](https://www.pexels.com/photo/blauer-und-gruner-elefant-mit-licht-1680755/)
-* `tests/Fixture/audio.mp3`: davidbain, Creative Commons, see [freesound.org](https://freesound.org/people/davidbain/sounds/136777/)
-* `tests/Fixture/document.pdf`: Chem8240ja, Public Domain, see [Wikipedia](https://en.m.wikipedia.org/wiki/File:Re_example.pdf)

examples/huggingface/README.md

Lines changed: 128 additions & 0 deletions

@@ -0,0 +1,128 @@
# Symfony Hugging Face Examples

This directory contains various examples of how to use Symfony AI with [Hugging Face](https://huggingface.co/)
and sits on top of the [Hugging Face Inference API](https://huggingface.co/inference-api).

The Hugging Face Hub provides access to a wide range of pre-trained open source models for various AI tasks, which you
can use directly via Symfony AI's Hugging Face Platform Bridge.

## Getting Started

Hugging Face offers a free tier for its Inference API, which you can use to get started. You need to create
an account on [Hugging Face](https://huggingface.co/join), generate an
[access token](https://huggingface.co/settings/tokens), and add it to the `.env.local` file in the root of the
examples directory as `HUGGINGFACE_KEY`:

```bash
echo 'HUGGINGFACE_KEY=hf_your_access_key' >> .env.local
```

Unlike other platforms, Hugging Face provides close to 50,000 models for various AI tasks, which lets you
easily try out different, specialized models for your use case. Common use cases can be found in this example directory.

## Running the Examples

You can run an example by executing the following command:

```bash
# Run all examples with the runner:
./runner huggingface

# Or run a specific example standalone, e.g., object detection:
php huggingface/object-detection.php
```

## Available Models

When running the examples, you might find that some models are not available and encounter an error like:

```
Model, provider or task not found (404).
```

This can happen when a pre-selected model in an example is no longer available or is not "warmed up" on
Hugging Face's side. You can change the model used in an example by updating the model name in the example script.

To find available models for a specific task, you can check out the [Hugging Face Model Hub](https://huggingface.co/models)
and filter by the desired task, or you can use the `huggingface/_model-listing.php` script.

### Listing Available Models

List _all_ models (limited to 1000 results by default):

```bash
php huggingface/_model-listing.php
```

Limit models to a specific _task_, e.g., object-detection:

```bash
php huggingface/_model-listing.php --task=object-detection
```

Limit models to a specific _provider_, e.g., "hf-inference":

```bash
# Single provider:
php huggingface/_model-listing.php --provider=hf-inference

# Multiple providers:
php huggingface/_model-listing.php --provider=sambanova,novita
```

Search for models matching a specific term, e.g., "gpt":

```bash
php huggingface/_model-listing.php --search=gpt
```

Limit the listing to currently warm models:

```bash
php huggingface/_model-listing.php --warm
```

You can combine the task and provider filters, and the task and warm filters, but not the provider and warm filters:

```bash
# Combine provider and task:
php huggingface/_model-listing.php --provider=hf-inference --task=object-detection

# Combine task and warm:
php huggingface/_model-listing.php --task=object-detection --warm

# Search for a warm gpt model for text-generation:
php huggingface/_model-listing.php --warm --task=text-generation --search=gpt
```

### Model Information

To get detailed information about a specific model, you can use the `huggingface/_model-info.php` script:

```bash
php huggingface/_model-info.php google/vit-base-patch16-224

Hugging Face Model Information
==============================

 Model: google/vit-base-patch16-224

 ----------- -----------------------------
  ID          google/vit-base-patch16-224
  Downloads   2985836
  Likes       889
  Task        image-classification
  Warm        yes
 ----------- -----------------------------

 Inference Provider:
 ----------------- -----------------------------
  Provider          hf-inference
  Status            live
  Provider ID       google/vit-base-patch16-224
  Task              image-classification
  Is Model Author   no
 ----------------- -----------------------------
```

It is important to understand what you can use a model for and which inference providers it is available on.
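
The README above notes that you can swap the model an example uses simply by changing the model name passed to the platform. As a minimal, hypothetical sketch of that pattern, based on the text-generation example updated further down in this commit (the `PlatformFactory` and `Task` imports assume the same bridge namespace as `ApiClient`; `env()` and `http_client()` come from the examples' `bootstrap.php`; result handling is omitted):

```php
<?php

// Hypothetical sketch of swapping the model in an example script; only the
// model name passed to invoke() changes, the rest follows the example scripts.

use Symfony\AI\Platform\Bridge\HuggingFace\PlatformFactory;
use Symfony\AI\Platform\Bridge\HuggingFace\Task;

require_once dirname(__DIR__).'/bootstrap.php';

$platform = PlatformFactory::create(env('HUGGINGFACE_KEY'), httpClient: http_client());

// Pick any warm text-generation model, e.g. one found via:
//   php huggingface/_model-listing.php --warm --task=text-generation
$result = $platform->invoke('katanemo/Arch-Router-1.5B', 'The quick brown fox jumps over the lazy', [
    'task' => Task::TEXT_GENERATION,
]);
```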
examples/huggingface/_model-info.php

Lines changed: 64 additions & 0 deletions

@@ -0,0 +1,64 @@
<?php

/*
 * This file is part of the Symfony package.
 *
 * (c) Fabien Potencier <fabien@symfony.com>
 *
 * For the full copyright and license information, please view the LICENSE
 * file that was distributed with this source code.
 */

use Symfony\AI\Platform\Bridge\HuggingFace\ApiClient;
use Symfony\Component\Console\Command\Command;
use Symfony\Component\Console\Input\InputArgument;
use Symfony\Component\Console\Input\InputInterface;
use Symfony\Component\Console\Output\OutputInterface;
use Symfony\Component\Console\SingleCommandApplication;
use Symfony\Component\Console\Style\SymfonyStyle;

require_once dirname(__DIR__).'/bootstrap.php';

$app = (new SingleCommandApplication('Hugging Face Model Listing'))
    ->setDescription('Inference information about a model on Hugging Face')
    ->addArgument('model', InputArgument::REQUIRED, 'Name of the model to get information about')
    ->setCode(function (InputInterface $input, OutputInterface $output) {
        $io = new SymfonyStyle($input, $output);
        $io->title('Hugging Face Model Information');

        $model = $input->getArgument('model');
        $info = (new ApiClient())->getModel($model);

        $io->text(sprintf('Model: %s', $model));

        $io->horizontalTable(
            ['ID', 'Downloads', 'Likes', 'Task', 'Warm'],
            [[
                $info['id'],
                $info['downloads'],
                $info['likes'],
                $info['pipeline_tag'],
                ('warm' === ($info['inference'] ?? null)) ? 'yes' : 'no',
            ]]
        );

        $io->text('Inference Provider:');
        if (!isset($info['inferenceProviderMapping']) || [] === $info['inferenceProviderMapping']) {
            $io->text('<comment>No inference provider information available for this model.</comment>');
            $io->newLine();
        } else {
            $io->horizontalTable(
                ['Provider', 'Status', 'Provider ID', 'Task', 'Is Model Author'],
                array_map(fn (string $provider, array $data) => [
                    $provider,
                    $data['status'],
                    $data['providerId'],
                    $data['task'],
                    $data['isModelAuthor'] ? 'yes' : 'no',
                ], array_keys($info['inferenceProviderMapping']), $info['inferenceProviderMapping'])
            );
        }

        return Command::SUCCESS;
    })
    ->run();

examples/huggingface/_model-listing.php

Lines changed: 16 additions & 8 deletions

@@ -20,28 +20,36 @@
 
 require_once dirname(__DIR__).'/bootstrap.php';
 
-$app = (new SingleCommandApplication('HuggingFace Model Listing'))
-    ->setDescription('Lists all available models on HuggingFace')
+$app = (new SingleCommandApplication('Hugging Face Model Listing'))
+    ->setDescription('Lists all available models on Hugging Face')
     ->addOption('provider', 'p', InputOption::VALUE_REQUIRED, 'Name of the inference provider to filter models by')
     ->addOption('task', 't', InputOption::VALUE_REQUIRED, 'Name of the task to filter models by')
+    ->addOption('warm', 'w', InputOption::VALUE_NONE, 'Only list models that are "warm" (i.e. ready for inference without cold start)')
+    ->addOption('search', 's', InputOption::VALUE_REQUIRED, 'Search term to filter models by')
     ->setCode(function (InputInterface $input, OutputInterface $output) {
         $io = new SymfonyStyle($input, $output);
-        $io->title('HuggingFace Model Listing');
+        $io->title('Hugging Face Model Listing');
 
         $provider = $input->getOption('provider');
         $task = $input->getOption('task');
+        $search = $input->getOption('search');
+        $warm = (bool) $input->getOption('warm');
 
-        $models = (new ApiClient())->models($provider, $task);
+        $models = (new ApiClient())->getModels($provider, $task, $search, $warm);
 
         if (0 === count($models)) {
-            $io->error('No models found for the given provider and task.');
+            $io->error('No models found for the given filters.');
 
            return Command::FAILURE;
        }
 
-        $io->listing(
-            array_map(fn (Model $model) => $model->getName(), $models)
-        );
+        $formatModel = function (Model $model) {
+            return sprintf('%s <comment>[%s]</>', $model->getName(), implode(', ', $model->getOptions()['tags'] ?? []));
+        };
+
+        $io->listing(array_map($formatModel, $models));
+
+        $io->success(sprintf('Found %d model(s).', count($models)));
 
        return Command::SUCCESS;
    })

examples/huggingface/object-detection.php

Lines changed: 1 addition & 1 deletion

@@ -17,7 +17,7 @@
 
 $platform = PlatformFactory::create(env('HUGGINGFACE_KEY'), httpClient: http_client());
 
-$image = Image::fromFile(dirname(__DIR__, 2).'/fixtures/image.jpg');
+$image = Image::fromFile(dirname(__DIR__, 2).'/fixtures/accordion.jpg');
 $result = $platform->invoke('facebook/detr-resnet-50', $image, [
     'task' => Task::OBJECT_DETECTION,
 ]);

examples/huggingface/table-question-answering.php

Lines changed: 2 additions & 2 deletions

@@ -19,12 +19,12 @@
 $input = [
     'query' => 'select year where city = beijing',
     'table' => [
-        'year' => [1896, 1900, 1904, 2004, 2008, 2012],
+        'year' => ['1896', '1900', '1904', '2004', '2008', '2012'],
         'city' => ['athens', 'paris', 'st. louis', 'athens', 'beijing', 'london'],
     ],
 ];
 
-$result = $platform->invoke('microsoft/tapex-base', $input, [
+$result = $platform->invoke('google/tapas-base-finetuned-wtq', $input, [
     'task' => Task::TABLE_QUESTION_ANSWERING,
 ]);
examples/huggingface/text-generation.php

Lines changed: 1 addition & 1 deletion

@@ -16,7 +16,7 @@
 
 $platform = PlatformFactory::create(env('HUGGINGFACE_KEY'), httpClient: http_client());
 
-$result = $platform->invoke('gpt2', 'The quick brown fox jumps over the lazy', [
+$result = $platform->invoke('katanemo/Arch-Router-1.5B', 'The quick brown fox jumps over the lazy', [
     'task' => Task::TEXT_GENERATION,
 ]);

fixtures/README.md

Lines changed: 8 additions & 0 deletions

@@ -0,0 +1,8 @@
# Fixture Licenses

For testing multi-modal features, the repository contains binary media content, with the following owners and licenses:

* `tests/Fixture/accordion.jpg`: Jefferson Lucena, Creative Commons, see [pexels.com](https://www.pexels.com/photo/man-playing-accordion-10153219/)
* `tests/Fixture/audio.mp3`: davidbain, Creative Commons, see [freesound.org](https://freesound.org/people/davidbain/sounds/136777/)
* `tests/Fixture/document.pdf`: Chem8240ja, Public Domain, see [Wikipedia](https://en.m.wikipedia.org/wiki/File:Re_example.pdf)
* `tests/Fixture/image.jpg`: Chris F., Creative Commons, see [pexels.com](https://www.pexels.com/photo/blauer-und-gruner-elefant-mit-licht-1680755/)

fixtures/accordion.jpg

136 KB

src/platform/src/Bridge/HuggingFace/ApiClient.php

Lines changed: 64 additions & 2 deletions

@@ -11,6 +11,7 @@
 
 namespace Symfony\AI\Platform\Bridge\HuggingFace;
 
+use Symfony\AI\Platform\Exception\RuntimeException;
 use Symfony\AI\Platform\Model;
 use Symfony\Component\HttpClient\HttpClient;
 use Symfony\Contracts\HttpClient\HttpClientInterface;
@@ -27,17 +28,78 @@ public function __construct(
     }
 
     /**
+     * @return array{
+     *     id: string,
+     *     downloads: int,
+     *     likes: int,
+     *     pipeline_tag: string|null,
+     *     inference: string|null,
+     *     inferenceProviderMapping: array<string, array{
+     *         status: 'live'|'staging',
+     *         providerId: string,
+     *         task: string,
+     *         isModelAuthor: bool,
+     *     }>|null,
+     * }
+     */
+    public function getModel(string $modelId): array
+    {
+        $result = $this->httpClient->request('GET', 'https://huggingface.co/api/models/'.$modelId, [
+            'query' => [
+                'expand' => ['downloads', 'likes', 'pipeline_tag', 'inference', 'inferenceProviderMapping'],
+            ],
+        ]);
+
+        $data = $result->toArray(false);
+
+        if (isset($data['error'])) {
+            throw new RuntimeException(\sprintf('Error fetching model info for "%s": "%s"', $modelId, $data['error']));
+        }
+
+        return $data;
+    }
+
+    /**
+     * @param ?string $provider Filter by inference provider (see Provider::*)
+     * @param ?string $task     Filter by task (see Task::*)
+     * @param ?string $search   Search term to filter models by
+     * @param bool    $warm     Filter for models with warm inference available
+     *
      * @return Model[]
      */
-    public function models(?string $provider, ?string $task): array
+    public function getModels(?string $provider = null, ?string $task = null, ?string $search = null, bool $warm = false): array
     {
         $result = $this->httpClient->request('GET', 'https://huggingface.co/api/models', [
             'query' => [
                 'inference_provider' => $provider,
                 'pipeline_tag' => $task,
+                'search' => $search,
+                ...$warm ? ['inference' => 'warm'] : [],
             ],
         ]);
 
-        return array_map(fn (array $model) => new Model($model['id']), $result->toArray());
+        $data = $result->toArray(false);
+
+        if (isset($data['error'])) {
+            throw new RuntimeException(\sprintf('Error fetching models: "%s"', $data['error']));
+        }
+
+        return array_map($this->convertToModel(...), $data);
+    }
+
+    /**
+     * @param array{
+     *     id: string,
+     *     pipeline_tag?: string,
+     * } $data
+     */
+    private function convertToModel(array $data): Model
+    {
+        return new Model(
+            $data['id'],
+            options: [
+                'tags' => isset($data['pipeline_tag']) ? [$data['pipeline_tag']] : [],
+            ],
+        );
     }
 }
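
To make the extended `ApiClient` surface concrete, here is a minimal usage sketch based only on the signatures added in this diff; the chosen task and model name are illustrative, and autoloading is assumed to be set up as in the examples:

```php
<?php

// Usage sketch for the extended ApiClient, based on the getModel()/getModels()
// signatures added above. Filter values and the model name are illustrative.

use Symfony\AI\Platform\Bridge\HuggingFace\ApiClient;

$client = new ApiClient();

// List warm models for a task; each Model carries its pipeline tag in the "tags" option.
$models = $client->getModels(task: 'object-detection', warm: true);

foreach ($models as $model) {
    echo $model->getName().' ['.implode(', ', $model->getOptions()['tags'] ?? []).']'.PHP_EOL;
}

// Fetch metadata for a single model, including its inference provider mapping.
$info = $client->getModel('facebook/detr-resnet-50');
echo ($info['pipeline_tag'] ?? 'unknown task').PHP_EOL;
```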
