Mirror of https://github.com/AIDotNet/AntSK.git (synced 2026-02-17 14:06:11 +08:00)

Commit: Update README.en.md

Changed file: README.en.md (48)
@@ -53,28 +53,6 @@ Due to the low configuration of the cloud server, the local model cannot be run,

### Other Function Examples

[Video Demonstration](https://www.bilibili.com/video/BV1zH4y1h7Y9/)
First, you need to create a knowledge base.

You can import documents or URLs into the knowledge base. Click a document to check how the knowledge base has sliced it.

Then we need to create an application, which can be either a dialogue application or a knowledge base application.

For a knowledge base application, select the existing knowledge bases; multiple selections are possible.

Then, in the dialogue, you can ask questions about the documents in the knowledge base.

Additionally, we can create dialogue applications and configure prompt templates in the corresponding application.

Let's take a look at the results.
## How to get started?

Here I am using Postgres for both data and vector storage, because Semantic Kernel and Kernel Memory support it, but you can also use other options.
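As a rough illustration (the configuration key name here is a placeholder, not AntSK's actual key), a Postgres connection string for the .NET Npgsql driver generally takes this form:

```json
{
  "DBConnection": "Host=localhost;Port=5432;Database=antsk;Username=postgres;Password=your_password"
}
```

Adjust the host, database name, and credentials to your environment; both Semantic Kernel's Postgres connector and Kernel Memory accept a standard Npgsql-style connection string like this.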
@@ -162,12 +140,6 @@ KernelMemory.VectorDb

//Local model execution options: GPU or CPU. When using an online API, either option works.
LLamaSharp.RunType

//Model path for local chat sessions. Note the difference in file paths between Linux and Windows drives.
LLamaSharp.Chat

//Model path for the local embedding model. Note the difference in file paths between Linux and Windows drives.
LLamaSharp.Embedding

//Local model directory, used for quick selection of models under llama, as well as for saving downloaded models.
LLamaSharp.FileDirectory
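Putting the keys above together, an appsettings-style fragment might look like the following; the exact nesting and the model file names are assumptions, so check your own appsettings.json:

```json
{
  "KernelMemory": {
    "VectorDb": "Postgres"
  },
  "LLamaSharp": {
    "RunType": "CPU",
    "Chat": "D:\\models\\chat-model.gguf",
    "Embedding": "D:\\models\\embedding-model.gguf",
    "FileDirectory": "D:\\models\\"
  }
}
```

On Linux the same paths would instead look like `/data/models/chat-model.gguf`, which is the Linux/Windows drive distinction the comments above warn about.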
@@ -206,6 +178,26 @@ I'm using CodeFirst mode for the database, so as long as the database connection

8. Many people ask about the difference between LLamaSharp and llamafactory. LLamaSharp is a .NET implementation of llama.cpp and only supports local gguf models, while llamafactory supports a much wider variety of models and is implemented in Python; that is the main difference. Additionally, llamafactory can fine-tune models, which is an area we plan to focus on integrating in the future.
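To make the comparison concrete, here is a minimal sketch of loading a local gguf model with LLamaSharp; the model path is a placeholder and the exact API surface varies between LLamaSharp versions, so treat this as illustrative rather than AntSK's actual code:

```csharp
using LLama;
using LLama.Common;

// Placeholder path to a local gguf model file.
var parameters = new ModelParams("/models/chat-model.gguf")
{
    ContextSize = 2048,   // prompt context window
    GpuLayerCount = 0     // 0 = pure CPU; raise to offload layers to the GPU
};

// llama.cpp weights are loaded directly from the gguf file,
// which is why only gguf models are supported.
using var weights = LLamaWeights.LoadFromFile(parameters);
using var context = weights.CreateContext(parameters);
var executor = new InteractiveExecutor(context);
```

llamafactory, by contrast, loads Hugging Face-format models through its Python stack, which is why it covers a wider range of models and can also fine-tune them.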
## 🤝 Contributing

If you would like to contribute, feel free to create a [Pull Request](https://github.com/AIDotNet/AntSK/pulls) or file a [Bug Report](https://github.com/AIDotNet/AntSK/issues/new).
## 💕 Contributors

This project exists thanks to all the people who contribute.
<a href="https://github.com/AIDotNet/AntSK/graphs/contributors">
  <img src="https://contrib.rocks/image?repo=AIDotNet/AntSK&max=1000&columns=15&anon=1" />
</a>
## 🚨 Code of Conduct

This project has adopted the code of conduct defined by the Contributor Covenant to clarify expected behavior in our community.
For more information, see the [.NET Foundation Code of Conduct](https://dotnetfoundation.org/code-of-conduct).

To learn more or get started with **AntSK**, follow my official WeChat account and join the discussion group.

## Contact Me