Integrating DeepSeek API in a Next.js and Express.js App
by Rakesh Potnuru · 5 min read



AI is like an infinity stone: learn to use its power and you can do wonders. In this guide, we'll build a personal accountant with the DeepSeek API on top of a sample project.


The Project

I've created a sample project called "Finance Tracker" for this tutorial. It lets you record your financial transactions. The front end is built with Next.js, and the back end uses tRPC (Express adapter) with a Postgres database and Drizzle ORM. You don't need to know tRPC to follow this tutorial. (In case you want to learn tRPC, check out the Build a Full-Stack App with tRPC and Next.js App Router series.)

GitHub repo: https://github.com/itsrakeshhq/finance-tracker

Let's add a chatbot to the product that acts as our personal accountant.

Backend

Getting DeepSeek API key

As you might already know, DeepSeek went viral when it launched because it offered performance comparable to OpenAI's models at a fraction of the cost. That popularity has also meant massive downtime ever since launch. I've been trying to use their API for a long time with no luck. (If you can get it to work, feel free to continue with it.)

So instead of using the API from the DeepSeek platform directly, we can use it through OpenRouter. OpenRouter provides access to many AI models from many providers, so if one provider goes down, it can route your request to another.
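For example, OpenRouter's REST API accepts an ordered list of fallback models. Here's a minimal sketch of a direct call to its chat completions endpoint; the `models` array is based on OpenRouter's documented routing feature, and the `OPENROUTER_API_KEY` env var and model names are just illustrative:

```ts
// Minimal sketch: calling OpenRouter's REST API directly with fallbacks.
// If the first model/provider is unavailable, OpenRouter tries the next.
async function askWithFallback(question: string): Promise<string> {
  const response = await fetch("https://openrouter.ai/api/v1/chat/completions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.OPENROUTER_API_KEY}`, // assumed env var
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      // OpenRouter-specific: an ordered list of models to try
      models: ["deepseek/deepseek-chat", "openai/gpt-4o-mini"],
      messages: [{ role: "user", content: question }],
    }),
  });

  const data = await response.json();
  return data.choices[0].message.content;
}
```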

API Integration

Put the API key you got above (from OpenRouter or DeepSeek) in backend/.env:

```shell
DEEPSEEK_API_KEY=your_deepseek_api_key
```

Since the DeepSeek API is compatible with the OpenAI SDK, let's install it:

```shell
yarn add openai
```

Create src/modules/ai/ai.controller.ts. This is where we'll write the AI accountant code. First, create the OpenAI client:

```ts
import OpenAI from "openai";

export default class AiController {
  private readonly openai: OpenAI;

  constructor() {
    this.openai = new OpenAI({
      // baseURL: "https://api.deepseek.com", // if using a DeepSeek API key
      baseURL: "https://openrouter.ai/api/v1",
      apiKey: process.env.DEEPSEEK_API_KEY,
    });
  }
}
```

Here, we've initialized the OpenAI client.

Note: Make sure to use the appropriate baseURL and apiKey based on what you are using - DeepSeek or OpenRouter.

Here's how I designed the accountant:

  1. Fetch transactions from the DB.
  2. Include them in the prompt.
  3. Answer user queries based on the transactions.
  4. Stream the response.
```ts
// ...
  async accountant(req: Request, res: Response) {
    try {
      const { query } = req.body;

      if (!query) {
        return res.status(400).json({ error: "Query is required" });
      }

      const userId = req.user.id;

      // Fetch the user's transactions from the database
      const data = await db
        .select()
        .from(transactions)
        .where(eq(transactions.userId, userId));

      if (data.length === 0) {
        return res.status(400).json({ error: "No transactions found" });
      }

      // Keep only the fields the model needs
      const formattedTxns = data.map((txn) => ({
        amount: txn.amount,
        txnType: txn.txnType,
        summary: txn.summary,
        tag: txn.tag,
        date: txn.createdAt,
      }));

      const prompt = `
        You are a personal accountant. You are given a list of transactions. You need to answer the user's query based on the transactions.

        YOU MUST KEEP THESE POINTS IN MIND WHILE ANSWERING THE USER QUERY:
        1. You must give a straightforward answer.
        2. Answer like you are talking to the user.
        3. You must not output your thinking process and reasoning. This is very important.
        4. The answer should be in markdown format.

        Transactions: ${JSON.stringify(formattedTxns)}
        Currency: $ (USD)

        User Query: ${query}
      `;

      const response = await this.openai.chat.completions.create({
        model: "deepseek/deepseek-chat",
        messages: [{ role: "user", content: prompt }],
        stream: true,
      });

      // Stream the answer back chunk by chunk instead of buffering it
      res.writeHead(200, {
        "Content-Type": "text/plain",
        "Transfer-Encoding": "chunked",
      });

      for await (const chunk of response) {
        if (chunk.choices[0].finish_reason === "stop") {
          break;
        }

        res.write(chunk.choices[0].delta.content || "");
      }

      res.end();
    } catch (error) {
      console.error({ error });
      if (!res.headersSent) {
        res.status(500).json({ error: "Internal server error" });
      }
    }
  }
// ...
```

Note: This is just to give you an idea. In a real project, it's not ideal to fetch all transactions from the DB for every query. To provide context to AI models, you can create embeddings instead (DeepSeek itself currently does not support embeddings), as sketched below.
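To give a rough idea of the embeddings approach, here's a sketch that uses OpenAI's embeddings endpoint to pick only the transactions most relevant to a query before building the prompt. The `rankTransactions` helper is hypothetical, and `text-embedding-3-small` is just one model choice; none of this is in the repo:

```ts
import OpenAI from "openai";

// Embeddings require a provider that offers them, e.g. OpenAI
const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0,
    normA = 0,
    normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Hypothetical helper: embed the query and every transaction summary,
// then keep only the top-k most similar transactions for the prompt.
async function rankTransactions(query: string, summaries: string[], k = 20) {
  const { data } = await openai.embeddings.create({
    model: "text-embedding-3-small",
    input: [query, ...summaries],
  });

  const [queryEmbedding, ...txnEmbeddings] = data.map((d) => d.embedding);

  return txnEmbeddings
    .map((embedding, i) => ({
      summary: summaries[i],
      score: cosineSimilarity(queryEmbedding, embedding),
    }))
    .sort((a, b) => b.score - a.score)
    .slice(0, k);
}
```

In production you'd precompute and store these vectors (e.g. with pgvector, since the project already uses Postgres) rather than re-embedding on every request.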

As you can see, all we did was write a prompt and provide as much context as possible to get accurate results. We then send the response as a stream instead of waiting for the whole answer, which could take a long time.

And then expose this controller in a route:

```ts
// src/index.ts

// ...
app.use(express.json());
app.post("/ai", authMiddleware, (req, res) =>
  new AiController().accountant(req, res)
);
// ...
```

authMiddleware protects the endpoint, so only logged-in users can access it. You can find the implementation in src/middleware/auth-middleware.ts.
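For a rough picture of what such a middleware does, here's a sketch. The repo's actual version may differ; `verifySession` is a hypothetical helper, and this assumes cookie-parser is set up and the Express `Request` type is augmented with a `user` field (the controller above relies on `req.user`):

```ts
import type { NextFunction, Request, Response } from "express";

// Hypothetical session lookup; the real project resolves the user
// from its own session store.
declare function verifySession(
  token: string
): Promise<{ id: string } | null>;

// Sketch: reject requests without a valid session, attach the user to req
export async function authMiddleware(
  req: Request,
  res: Response,
  next: NextFunction
) {
  const sessionToken = req.cookies?.session; // assumes cookie-parser

  if (!sessionToken) {
    return res.status(401).json({ error: "Unauthorized" });
  }

  const user = await verifySession(sessionToken);
  if (!user) {
    return res.status(401).json({ error: "Unauthorized" });
  }

  req.user = user; // assumes Request is augmented with `user`
  next();
}
```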

That's all we need from the backend.

Frontend

On the front end, let's create a chat widget. Switch to the frontend/ folder.

Create chat.tsx in src/components/modules/dashboard.

Chat UI

Then create a chat box UI with the shadcn Popover component.

```tsx
// chat.tsx

import { useEffect, useRef, useState } from "react";
import { BotMessageSquareIcon, UserRoundIcon } from "lucide-react";
import Markdown from "react-markdown";

// Import paths may differ in your setup
import { Button } from "@/components/ui/button";
import {
  Popover,
  PopoverContent,
  PopoverTrigger,
} from "@/components/ui/popover";
import { cn } from "@/lib/utils";

export default function Chat() {
  const [conversation, setConversation] = useState<
    {
      role: "user" | "assistant";
      content: string;
    }[]
  >([
    {
      role: "assistant",
      content: "Hello, how can I help you today?",
    },
  ]);
  const [liveResponse, setLiveResponse] = useState<string>("");
  const [isThinking, setIsThinking] = useState<boolean>(false);

  // Auto scroll to bottom when a new message is added
  const scrollRef = useRef<HTMLDivElement>(null);
  useEffect(() => {
    if (scrollRef.current) {
      scrollRef.current.scrollIntoView({ behavior: "smooth" });
    }
  }, [conversation, liveResponse]);

  return (
    <Popover>
      <PopoverTrigger className="absolute right-4 bottom-4" asChild>
        <Button size={"icon"} className="rounded-full">
          <BotMessageSquareIcon className="w-4 h-4" />
        </Button>
      </PopoverTrigger>
      <PopoverContent align="end" className="w-[500px] h-[600px] p-0 space-y-4">
        <h1 className="text-xl font-bold text-center p-4 pb-0">
          Personal Accountant
        </h1>
        <hr />
        <div className="pt-0 relative h-full">
          <div className="flex flex-col gap-2 h-[calc(100%-150px)] overflow-y-auto px-4 pb-20">
            {conversation.map((message, index) => (
              <div
                key={index}
                className={cn("flex flex-row gap-2 items-start", {
                  "rounded-lg bg-muted p-2 ml-auto flex-row-reverse":
                    message.role === "user",
                })}
              >
                {message.role === "assistant" && (
                  <BotMessageSquareIcon className="w-4 h-4 shrink-0 mt-1.5" />
                )}
                {message.role === "user" && (
                  <UserRoundIcon className="w-4 h-4 shrink-0 mt-1" />
                )}
                <Markdown className="prose prose-sm prose-h1:text-xl prose-h2:text-lg prose-h3:text-base">
                  {message.content}
                </Markdown>
              </div>
            ))}
            {isThinking && (
              <div className="flex flex-row gap-2 items-center">
                <BotMessageSquareIcon className="w-4 h-4" />
                <p className="animate-pulse prose prose-sm">Thinking...</p>
              </div>
            )}
            {liveResponse.length > 0 && (
              <div className="flex flex-row gap-2 items-start">
                <BotMessageSquareIcon className="w-4 h-4 shrink-0 mt-1.5" />
                <Markdown className="prose prose-sm prose-h1:text-xl prose-h2:text-lg prose-h3:text-base">
                  {liveResponse}
                </Markdown>
              </div>
            )}
            <div ref={scrollRef} />
          </div>
          <hr />
        </div>
      </PopoverContent>
    </Popover>
  );
}
```

Form

Now create a form to handle user input:

```tsx
const formSchema = z.object({
  message: z.string().min(1),
});

export default function Chat() {
  // ...
  const form = useForm<z.infer<typeof formSchema>>({
    resolver: zodResolver(formSchema),
    defaultValues: {
      message: "",
    },
  });

  // ... inside the PopoverContent, below the messages list:
      <hr />
      <div className="absolute bottom-20 left-0 right-0 p-4 w-full">
        <Form {...form}>
          <form
            onSubmit={form.handleSubmit(onSubmit)}
            className="flex flex-row gap-2"
          >
            <FormField
              control={form.control}
              name="message"
              render={({ field }) => (
                <FormItem className="flex-1">
                  <FormControl>
                    <Input placeholder="Ask me anything..." {...field} />
                  </FormControl>
                  <FormMessage />
                </FormItem>
              )}
            />
            <Button size={"icon"} type="submit">
              <SendIcon className="w-4 h-4" />
            </Button>
          </form>
        </Form>
      </div>
  {/* ... */}
// ...
```

Submit handler

Finally, implement the submit handler. As mentioned above, the backend sends the response as a stream, so it's better to show the response as it arrives.

```tsx
// ...
  const onSubmit = async (data: z.infer<typeof formSchema>) => {
    setIsThinking(true);
    setConversation((prev) => [
      ...prev,
      { role: "user", content: data.message },
    ]);
    form.reset();

    // Accumulates the streamed text locally (kept separate from the
    // `liveResponse` state so the loop never reads stale state)
    let streamedText = "";
    try {
      const response = await fetch(`${process.env.NEXT_PUBLIC_API_URL}/ai`, {
        method: "POST",
        headers: {
          "Content-Type": "application/json",
        },
        credentials: "include",
        body: JSON.stringify({
          query: data.message,
        }),
      });

      const reader = response.body?.getReader();
      const decoder = new TextDecoder();

      setIsThinking(false);
      if (!reader) return;

      // Read the response stream chunk by chunk
      while (true) {
        const { done, value } = await reader.read();
        if (done) break;

        const text = decoder.decode(value, { stream: true });

        streamedText += text;
        setLiveResponse(streamedText);
      }

      // Once the stream ends, move the full response into the conversation
      setConversation((prev) => [
        ...prev,
        { role: "assistant", content: streamedText },
      ]);
    } catch (error) {
      console.error(error);
    } finally {
      setIsThinking(false);
      setLiveResponse("");
    }
  };
// ...
```

Note: Make sure to set NEXT_PUBLIC_API_URL in frontend/.env.local.
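For local development, that might look like this (the port is just an example; use whatever your Express server listens on):

```shell
# frontend/.env.local
NEXT_PUBLIC_API_URL=http://localhost:4000
```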

That's it!

If you don't want to deal with all this, you can also use Vercel's AI SDK, which has all the parts you need.
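For comparison, here's roughly what the client side looks like with the AI SDK's useChat hook. The exact API surface varies between SDK versions, so treat this as a sketch; the /api/chat route is an assumed server endpoint:

```tsx
"use client";

import { useChat } from "ai/react"; // from the `ai` package

// Minimal sketch: useChat manages conversation state and streams
// the model's response from a chat API route for you.
export default function Chat() {
  const { messages, input, handleInputChange, handleSubmit } = useChat({
    api: "/api/chat", // assumed route that proxies to the model
  });

  return (
    <div>
      {messages.map((m) => (
        <p key={m.id}>
          {m.role}: {m.content}
        </p>
      ))}
      <form onSubmit={handleSubmit}>
        <input value={input} onChange={handleInputChange} />
      </form>
    </div>
  );
}
```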


Feel free to ask any questions in the comments below.

Btw, subscribe to my newsletter (find the form in footer 👇 or sidebar 👉).

