## How to use with Ollama

1. **Install Ollama:**

   ```
   curl -fsSL https://ollama.com/install.sh | sh
   ```

2. **Run the *NT-Java* model:**

   ```
   ollama run NT-Java
   ```

### Building from `Modelfile`

Assuming that you have already downloaded the GGUF files, here is how you can use them with [Ollama](https://ollama.com/):

1. **Get the Modelfile:**

   ```
   huggingface-cli download infosys/NT-Java-1.1B-GGUF Modelfile_q4_k_m --local-dir /path/to/your/local/dir
   ```

2. **Build the Ollama model:**

   Use the Ollama CLI to create your model with the following command:

   ```
   ollama create NT-Java -f Modelfile_q4_k_m
   ```

3. **Run the *NT-Java* model:**

   Now you can run the NT-Java model with Ollama using the following command:

   ```
   ollama run NT-Java "Your prompt here"
   ```

Replace "Your prompt here" with the actual prompt you want to use for generating responses from the model.
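For context, a `Modelfile` like the one referenced above is a small Ollama build recipe: it points at the local GGUF weights and can set runtime options. A minimal sketch of what such a file might contain (the weights file name and parameter values below are illustrative assumptions, not the contents of the actual `Modelfile_q4_k_m` — use the downloaded file as-is):

```
# Point Ollama at the local GGUF weights file (file name is an assumed example)
FROM ./NT-Java-1.1B-Q4_K_M.gguf

# Optional runtime settings (illustrative values)
PARAMETER temperature 0.7
PARAMETER num_ctx 4096
```

`ollama create` reads this recipe and packages the weights plus settings into a named model that `ollama run` can then serve.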