How To Integrate DeepSeek AI into your app: React and React Native Guide
DeepSeek AI is one of the newest Chinese innovations in generative AI. It shocked the world and the AI landscape, reportedly built at less than 10% of the cost of ChatGPT.
Introduction
We’ve witnessed an unprecedented race in generative AI development in the last year. While OpenAI’s GPT series and Anthropic’s Claude have dominated headlines, open-source alternatives like Meta’s Llama and DeepSeek are rapidly gaining traction. These open-source models are particularly interesting because they offer comparable performance while providing more flexibility and control over deployment.
Companies are racing to create an AGI that can solve humanity's problems. The road there is still long in my opinion, but we are getting closer each time, and billions of dollars are being spent on it: for example, the $500 billion project in the USA and the $50 billion AI investment from the UAE in France.
DeepSeek, in particular, has shown impressive results in recent benchmarks. According to their technical report, DeepSeek has demonstrated superior performance in coding tasks and mathematical reasoning compared to other open-source models. Their 67B parameter model has achieved competitive results against GPT-4 in several benchmarks, making it an attractive option for developers looking for powerful AI capabilities in their applications.
DeepSeek’s impressive performance in benchmarks suggests a promising future for open-source AI models. As these models continue to improve, developers will have more options for implementing AI capabilities in their applications without being locked into proprietary solutions.
The ability to self-host these models or use them through APIs provides flexibility in deployment options, which is particularly important for applications with specific privacy or regulatory requirements.
Integrating DeepSeek AI:
The best part about DeepSeek AI is that its API is compatible with the OpenAI SDK: you can use the same openai package and point it at DeepSeek's base URL, using a token from their website.
In this tutorial, we go further and call their API endpoints directly. This way, it is easy to integrate into a React, Next.js, Node.js, or React Native app, depending on your needs.
You can check the official documentation here.
Get a token from their website
Go to https://platform.deepseek.com/, create an account, and generate your own token.
Define API endpoints
const DEEPSEEK_API_URL = 'https://api.deepseek.com/chat/completions';
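The request body you send to this endpoint is a JSON object with a `model` and a `messages` array, following the OpenAI-compatible schema DeepSeek documents. As a minimal sketch, a small helper (the helper name `buildChatRequest` is mine, not from any SDK) keeps that shape in one place:

```typescript
// Hypothetical helper (not part of DeepSeek's SDK): builds the JSON body
// for a chat completions request. Field names follow the OpenAI-compatible
// schema that DeepSeek documents.
type ChatMessage = {role: 'system' | 'user' | 'assistant'; content: string};

function buildChatRequest(userMessage: string, history: ChatMessage[] = []) {
  return {
    model: 'deepseek-chat',
    // Append the new user turn to any prior conversation history
    messages: [...history, {role: 'user' as const, content: userMessage}],
  };
}

const body = buildChatRequest('Hello!');
```

You would then POST this body to DEEPSEEK_API_URL with your `Authorization: Bearer <token>` header, as shown in the next step.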
Add your chat completion logic
Define your fetch using an HTTP client such as Axios or the built-in Fetch API.
Example:
const fetchDeepseekAPI = async (userMessage: string) => {
  try {
    const response = await axios.post(
      DEEPSEEK_API_URL,
      {
        model: 'deepseek-chat',
        messages: [{role: 'user', content: userMessage}],
      },
      {
        headers: {
          Authorization: `Bearer ${DEEPSEEK_API_KEY}`,
          'Content-Type': 'application/json',
        },
      },
    );
    // Process response and update chat
  } catch (error) {
    console.error('DeepSeek API Error:', error);
  }
};
Use the API call inside your app. Here is an example of the code using React Native:
import React, {useState, useCallback, useEffect} from 'react';
import {StyleSheet, View} from 'react-native';
import {
  GiftedChat,
  InputToolbar,
  Send,
  IMessage,
  SendProps,
  InputToolbarProps,
} from 'react-native-gifted-chat';
import {useSafeAreaInsets} from 'react-native-safe-area-context';
import axios from 'axios'; // Import Axios for API calls
import {ActivityIndicator, Icon} from 'react-native-paper';

const PROFILE_IMAGE =
  'https://media.licdn.com/dms/image/v2/D4E03AQGf6Ev9buJRbA/profile-displayphoto-shrink_800_800/profile-displayphoto-shrink_800_800/0/1675786659686?e=1744243200&v=beta&t=IC8rTHOhFdNrI8_n1fzgbn60yNaSsf2YzWtAM2iJSeo';
const DEEPSEEK_API_URL = 'https://api.deepseek.com/v1/chat/completions';
const API_KEY = 'put your api key here'; // Replace with your DeepSeek API key

export default function ChatGPTConversationList() {
  const [messages, setMessages] = useState<IMessage[]>([]);
  const [loading, setIsLoading] = useState(false);
  const [text, setText] = useState('');
  const insets = useSafeAreaInsets();

  useEffect(() => {
    setMessages([
      {
        _id: 0,
        system: true,
        text: 'Type your question or share what’s on your mind…',
        createdAt: new Date(),
        user: {
          _id: 0,
          name: 'DeepSeek',
          avatar: 'https://cdn.deepseek.com/platform/favicon.png',
        },
      },
    ]);
  }, []);

  const sendMessageToDeepSeek = async (userMessage: IMessage) => {
    setIsLoading(true);
    try {
      const response = await axios.post(
        DEEPSEEK_API_URL,
        {
          model: 'deepseek-chat',
          // Send the message text, not the whole IMessage object
          messages: [{role: 'user', content: userMessage.text}],
        },
        {
          headers: {
            Authorization: `Bearer ${API_KEY}`,
            'Content-Type': 'application/json',
          },
        },
      );
      const botReply = response.data.choices[0].message.content;
      const newBotMessage = {
        _id: Math.random().toString(36).substring(7),
        text: botReply,
        createdAt: new Date(),
        user: {
          _id: 0,
          name: 'DeepSeek',
          avatar: 'https://cdn.deepseek.com/platform/favicon.png',
        },
      };
      setMessages(prevMessages =>
        GiftedChat.append(prevMessages, [newBotMessage]),
      );
    } catch (error) {
      console.error('DeepSeek API Error:', error);
    } finally {
      setIsLoading(false);
    }
  };

  const onSend = useCallback((messages: IMessage[]) => {
    setMessages(prevMessages => GiftedChat.append(prevMessages, messages));
    sendMessageToDeepSeek(messages[0]); // Send message to DeepSeek API
  }, []);

  const renderInputToolbar = (props: InputToolbarProps<IMessage>) => (
    <InputToolbar {...props} containerStyle={{backgroundColor: '#f0f0f0'}} />
  );

  const renderFooter = useCallback(() => {
    if (loading) {
      return (
        <View>
          <ActivityIndicator size="large" color="#5BC0EB" />
        </View>
      );
    }
    return null;
  }, [loading]);

  const renderSend = useCallback((props: SendProps<IMessage>) => {
    return (
      <Send {...props}>
        <View style={styles.sendButton}>
          <Icon source="send" color="blue" size={30} />
        </View>
      </Send>
    );
  }, []);

  const renderScrollToBottom = useCallback(() => {
    return <Icon source="chevron-down" size={36} color="#5BC0EB" />;
  }, []);

  return (
    <>
      <GiftedChat
        messages={messages}
        onSend={messages => onSend(messages)}
        onInputTextChanged={setText}
        bottomOffset={insets.bottom}
        renderSend={renderSend}
        renderInputToolbar={renderInputToolbar}
        renderChatFooter={renderFooter}
        scrollToBottomComponent={renderScrollToBottom}
        user={{
          _id: '1',
          name: 'Malik',
          avatar: PROFILE_IMAGE,
        }}
        showUserAvatar
        alwaysShowSend
        scrollToBottom
      />
    </>
  );
}

const styles = StyleSheet.create({
  composer: {
    borderRadius: 18,
    borderWidth: 1,
    borderColor: 'grey',
    paddingHorizontal: 10,
    paddingTop: 8,
    fontSize: 16,
    marginVertical: 4,
  },
  sendButton: {
    justifyContent: 'center',
    alignItems: 'center',
    marginRight: 10,
  },
  sendIcon: {
    width: 44,
    height: 44,
    borderRadius: 22,
    backgroundColor: '#5BC0EB',
  },
});
You can check the code here: https://reactnativetemplates.com/screensCode/19
Best Practices and Considerations
API Key Security: Never expose your API key in the client-side code. Use environment variables or a backend service to secure your credentials.
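As a minimal sketch of the environment-variable approach: rather than hard-coding the key in the bundle (as the example above does for simplicity), resolve it from configuration at startup and fail loudly if it is missing. The helper name `getDeepseekApiKey` and the variable name `DEEPSEEK_API_KEY` are my own conventions; in React Native you would typically inject such values with a library like react-native-config, or better still keep the key on your own backend and proxy the DeepSeek calls through it.

```typescript
// Hypothetical helper (names are illustrative): resolve the API key from a
// configuration record instead of hard-coding it in client-side code.
function getDeepseekApiKey(env: Record<string, string | undefined>): string {
  const key = env.DEEPSEEK_API_KEY;
  if (!key) {
    // Fail fast at startup rather than sending unauthenticated requests
    throw new Error('DEEPSEEK_API_KEY is not configured.');
  }
  return key;
}
```

In a Node.js backend you would call `getDeepseekApiKey(process.env)`; on a client, prefer not to ship the key at all and route requests through your server.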
Error Handling: Implement robust error handling for API calls and network issues.
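One way to make error handling concrete: Axios errors expose a `response` field when the server replied with a non-2xx status, and a `request` field when the request went out but no response came back (e.g. network loss). A small mapper, sketched below with a hypothetical name `describeApiError` and illustrative messages, can turn these cases into user-facing text:

```typescript
// Hypothetical error mapper (names and messages are illustrative).
// The shape mirrors what Axios attaches to a rejected request.
interface ApiErrorLike {
  response?: {status: number};
  request?: unknown;
}

function describeApiError(error: ApiErrorLike): string {
  if (error.response) {
    // Server replied with an error status
    if (error.response.status === 401) return 'Invalid API key.';
    if (error.response.status === 429) return 'Rate limit reached, try again shortly.';
    return `DeepSeek returned an error (HTTP ${error.response.status}).`;
  }
  if (error.request) {
    // Request sent, no response received
    return 'No response from DeepSeek. Check your network connection.';
  }
  return 'Unexpected error while contacting DeepSeek.';
}
```

In the catch block of `sendMessageToDeepSeek`, you could append `describeApiError(error)` as a system message instead of only logging to the console.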
Loading States: Provide visual feedback during API calls using loading indicators.
Message Persistence: Consider implementing local storage to persist chat history.
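For the persistence point, note that Gifted Chat messages carry `Date` objects, which become plain strings after a JSON round trip. A sketch of serialize/deserialize helpers (the names `serializeMessages`/`deserializeMessages` are mine, and pairing them with @react-native-async-storage/async-storage is one option, not a requirement):

```typescript
// Hypothetical persistence helpers: JSON.stringify turns Dates into ISO
// strings, so we revive createdAt as a Date when loading history back.
type StoredMessage = {
  _id: string | number;
  text: string;
  createdAt: Date;
  user: {_id: string | number; name?: string};
};

function serializeMessages(messages: StoredMessage[]): string {
  return JSON.stringify(messages);
}

function deserializeMessages(raw: string): StoredMessage[] {
  const parsed = JSON.parse(raw) as Array<
    Omit<StoredMessage, 'createdAt'> & {createdAt: string}
  >;
  return parsed.map(m => ({...m, createdAt: new Date(m.createdAt)}));
}
```

With AsyncStorage, for example, you might call `AsyncStorage.setItem('chat', serializeMessages(messages))` after each send and restore in the `useEffect` on mount.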
Conclusion:
What is the Future of Generative AI in Mobile applications?
The recent surge in Generative AI and AI agents seems heavily focused on web and desktop platforms. While some mobile apps offer wrappers for services like ChatGPT or, like Apple Intelligence, integrate AI directly, the future of AI in mobile app development remains unclear.
One theory suggests lightweight, mobile-friendly AI models are the answer. Meta’s work on a package for running AI models locally in React Native (I will write an article with more details about it later) supports this idea.
Others believe Generative AI wrappers will drive app development, but I question their revenue potential. These apps often rely on the same underlying models users can access directly. Consider calorie tracking: I can achieve this with ChatGPT, likely at a lower cost since I’m already using ChatGPT for other tasks. Why pay for a dedicated app when a broader platform can handle it?
Others suggest a hybrid approach: lightweight AI models running locally, with heavy computation sent through an API.
The landscape is still evolving. I’d love to hear your thoughts and predictions on the future of AI in mobile. Share your assumptions and ideas in the comments below.

