Update README.md
README.md CHANGED
@@ -73,22 +73,22 @@ pinned: false
     <p> </p>
   </div>
   <div class="lg:col-span-3">
-    <h1>
-    <h3>Intel
+    <h1>Get Started</h1>
+    <h3>1. Intel Acceleration Libraries</h3>
     <p class="mb-2">
       To get started with Intel hardware and software optimizations, download and install the Optimum Intel
-      and Intel® Extension for Transformers libraries.
+      and Intel® Extension for Transformers libraries. Follow these documents to learn how to install and use these libraries:
     </p>
     <ul>
-      <li class="ml-6"><a href="https://github.com/huggingface/optimum-intel" class="underline" data-ga-category="intel-org" data-ga-action="clicked optimum intel" data-ga-label="optimum intel">🤗 Optimum Intel library</a></li>
-      <li class="ml-6"><a href="https://github.com/intel/intel-extension-for-transformers" class="underline" data-ga-category="intel-org" data-ga-action="clicked intel extension for transformers" data-ga-label="intel extension for transformers">Intel® Extension for Transformers</a></li>
+      <li class="ml-6"><a href="https://github.com/huggingface/optimum-intel#readme" class="underline" data-ga-category="intel-org" data-ga-action="clicked optimum intel" data-ga-label="optimum intel">🤗 Optimum Intel library</a></li>
+      <li class="ml-6"><a href="https://github.com/intel/intel-extension-for-transformers#readme" class="underline" data-ga-category="intel-org" data-ga-action="clicked intel extension for transformers" data-ga-label="intel extension for transformers">Intel® Extension for Transformers</a></li>
     </ul>
     <p class="mb-2">
       The Optimum Intel library provides primarily hardware acceleration, while the Intel® Extension
       for Transformers is focused more on software acceleration. Both should be present to achieve ideal
       performance and productivity gains in transfer learning and fine-tuning with Hugging Face.
     </p>
-    <h3>Find
+    <h3>2. Find Your Model</h3>
     <p class="mb-2">
       Next, find your desired model (and dataset) by using the search box at the top-left of Hugging Face’s website.
       Add “intel” to your search to narrow your search to models pretrained by Intel.
@@ -98,7 +98,7 @@ pinned: false
       src="https://huggingface.co/spaces/Intel/README/resolve/main/hf-model_search.png"
       style="margin:auto;transform:scale(0.8);"
     />
-    <h3>Demo,
+    <h3>3. Read Through the Demo, Dataset, and Quick-Start Commands</h3>
     <p class="mb-2">
       On the model’s page (called a “Model Card”) you will find description and usage information, an embedded
       inferencing demo, and the associated dataset. In the upper-right of your screen, click “Use in Transformers”
@@ -115,4 +115,4 @@ pinned: false
       style="margin:auto;transform:scale(0.8);"
     />
   </div>
-</div>
+</div>
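The revised copy tells readers to install the 🤗 Optimum Intel library and Intel® Extension for Transformers before anything else. As a minimal sketch of verifying that both libraries are importable after following the two linked READMEs (the install commands and import names below are assumptions; the repositories are the source of truth):

```python
# Sketch: check that both Intel acceleration libraries import cleanly.
# Assumed install commands (see the linked READMEs for the current ones):
#   pip install optimum[openvino]                 # 🤗 Optimum Intel
#   pip install intel-extension-for-transformers  # Intel® Extension for Transformers
import importlib

for module_name in ("optimum.intel", "intel_extension_for_transformers"):
    try:
        importlib.import_module(module_name)
        print(f"{module_name}: available")
    except ImportError as exc:
        print(f"{module_name}: missing ({exc})")
```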
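Step 2 relies on adding “intel” to the Hub search box. The same filter can be applied programmatically through the huggingface_hub client; a small illustrative sketch (the author and search arguments are an example of mine, not part of the README):

```python
# Sketch: list a handful of Hub models published under the Intel organization.
# Requires the huggingface_hub package (pip install huggingface_hub).
from huggingface_hub import list_models

for model in list_models(author="Intel", search="bert", limit=5):
    print(model.id)
```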
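Step 3 points at the “Use in Transformers” button, which shows a ready-made loading snippet on each model card. The sketch below only mirrors the general shape of such a snippet; the model id is a placeholder to be replaced with the id copied from the card:

```python
# Sketch of a "Use in Transformers"-style quick start.
from transformers import AutoModel, AutoTokenizer

model_id = "Intel/<model-from-the-card>"  # placeholder, not a real checkpoint id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

inputs = tokenizer("Hello from the Intel organization page!", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # encoder-style models expose last_hidden_state
```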