---
title: Quickstart
description: Start building AI features in under five minutes
mode: wide
---
In this quickstart we'll show you how to get set up with Hypermode and build an intelligent API that you can integrate into your app. You'll learn how to use the basic components of a Modus app and how to deploy it to Hypermode.
Before you begin, make sure you have the following:

- Node.js, v22 or higher
- A text editor; we recommend VS Code
- A terminal, to access Modus through its command-line interface (CLI)
- A GitHub account
To get started, [create your first Modus app](/modus/quickstart). You can import this app into
Hypermode in the next step.
Then install the Hyp CLI, which you'll use to link your Modus app to Hypermode:

```bash
npm install -g @hypermode/hyp-cli
```
From the terminal, run the following command to import your Modus app into Hypermode. This command creates your Hypermode project and deploys your app.
```bash
hyp link
```
When Hypermode creates your project, a runtime is initiated for your app as well as connections to
any [Hypermode-hosted models](/hosted-models).
From the **Query** page, you can run a sample query to verify it's working as expected. In the following
query, we're going to use the `generateText` function to generate text from the shared Meta Llama
3.1 model based on the prompt "How are black holes created?"
```graphql
query myPrompt {
  generateText(text: "How are black holes created?")
}
```
You can also inspect the call in the console. The inference history shows the input Hypermode sent to the model:

```json
{
  "model": "meta-llama/Meta-Llama-3.1-8B-Instruct",
  "messages": [
    {
      "role": "system",
      "content": "You are a helpful assistant. Limit your answers to 150 words."
    },
    {
      "role": "user",
      "content": "How are black holes created?"
    }
  ],
  "max_tokens": 200,
  "temperature": 0.7
}
```
<img
className="block"
src="/images/hyp-quickstart/inference-history.png"
alt="Hypermode's console showing the inputs and outputs of the last model inference."
/>
Our API responds more formally than we'd like. Let's update our `generateText` function to respond exclusively with surfing analogies.
<Tabs>
<Tab title="Go">
In `main.go`, locate the `GenerateText` function and update its system message so the model responds only with surfing analogies:
```go main.go
func GenerateText(text string) (string, error) {
  model, err := models.GetModel[openai.ChatModel]("text-generator")
  if err != nil {
    return "", err
  }

  input, err := model.CreateInput(
    openai.NewSystemMessage("You are a helpful assistant. Only respond using surfing analogies and metaphors."),
    openai.NewUserMessage(text),
  )
  if err != nil {
    return "", err
  }

  output, err := model.Invoke(input)
  if err != nil {
    return "", err
  }

  return strings.TrimSpace(output.Choices[0].Message.Content), nil
}
```
</Tab>
<Tab title="AssemblyScript">
In `index.ts`, locate the `generateText` function and update its system message so the model responds only with surfing analogies:
```ts index.ts
export function generateText(text: string): string {
  const model = models.getModel<OpenAIChatModel>("text-generator")

  const input = model.createInput([
    new SystemMessage(
      "You are a helpful assistant. Only respond using surfing analogies and metaphors.",
    ),
    new UserMessage(text),
  ])

  const output = model.invoke(input)
  return output.choices[0].message.content.trim()
}
```
</Tab>
</Tabs>
Save the file and push the change to your Git repo. Hypermode automatically redeploys whenever
you push to the target branch. Go back to the Hypermode Console and run the same query as
before. You should see that the response now uses surfing analogies!
<img
className="block"
src="/images/hyp-quickstart/graphiql-surfing.png"
alt="Hypermode's console showing results of new query."
/>
Hypermode and Modus provide a powerful platform for building and hosting AI models, data, and logic. You now know the basics of Hypermode. There's no limit to what you can build.
And when you're ready to integrate Hypermode into your app, it's as simple as calling a GraphQL endpoint.
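As a sketch of that integration, a minimal TypeScript client might look like the following. The endpoint URL and API key here are placeholders, not real values; substitute the endpoint and key shown for your project in the Hypermode Console.

```typescript
// Sketch of calling a Hypermode GraphQL endpoint from an app.
// Both constants below are placeholders — use your project's values.
const HYPERMODE_ENDPOINT = "https://your-project.hypermode.app/graphql"
const HYPERMODE_API_KEY = "<your-api-key>"

// Build the GraphQL request body for the query we ran in the console.
export function buildQuery(prompt: string) {
  return {
    query: "query GenerateText($text: String!) { generateText(text: $text) }",
    variables: { text: prompt },
  }
}

// POST the query to the endpoint and unwrap the result.
export async function generateText(prompt: string): Promise<string> {
  const res = await fetch(HYPERMODE_ENDPOINT, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${HYPERMODE_API_KEY}`,
    },
    body: JSON.stringify(buildQuery(prompt)),
  })
  if (!res.ok) {
    throw new Error(`GraphQL request failed: ${res.status}`)
  }
  const { data, errors } = await res.json()
  if (errors) {
    throw new Error(errors[0].message)
  }
  return data.generateText
}
```

Because the request is plain GraphQL over HTTP, any GraphQL client library works just as well here.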