How can I change from OpenAI to ChatOpenAI in langchain and Flask?

2024-09-02 13:52



Background:

This is an implementation based on LangChain and Flask that streams responses from the OpenAI server through LangChain to a page whose JavaScript displays the streamed response.


I tried every way I could think of to modify the code below to replace `OpenAI` from the LangChain library with `ChatOpenAI`, without success. I include both implementations below: the one with `OpenAI`, which works, and the one with `ChatOpenAI`, which fails. Thanks to everyone in the community who can help me understand the problem; it would be very helpful if you could also show me how to solve it, since I have been trying for days and the error shown tells me very little.


Version of the code that works, but reports the library as deprecated:


from flask import Flask, Response
import threading
import queue

from langchain.llms import OpenAI
from langchain.callbacks.base import BaseCallbackManager
from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler

app = Flask(__name__)

@app.route('/')
def index():
    return Response('''<!DOCTYPE html>
<html>
<head><title>Flask Streaming Langchain Example</title></head>
<body>
<div id="output"></div>
<script>
const outputEl = document.getElementById('output');

(async function() {
  try {
    const controller = new AbortController();
    const signal = controller.signal;
    const timeout = 120000;  // Set the timeout to 120 seconds
    setTimeout(() => controller.abort(), timeout);

    const response = await fetch('/chain', {method: 'POST', signal});
    const reader = response.body.getReader();
    const decoder = new TextDecoder();
    let buffer = '';

    while (true) {
      const { done, value } = await reader.read();
      if (done) { break; }
      const text = decoder.decode(value, {stream: true});
      outputEl.innerHTML += text;
    }
  } catch (err) {
    console.error(err);
  }
})();
</script>
</body>
</html>''', mimetype='text/html')

class ThreadedGenerator:
    def __init__(self):
        self.queue = queue.Queue()

    def __iter__(self):
        return self

    def __next__(self):
        item = self.queue.get()
        if item is StopIteration:
            raise item
        return item

    def send(self, data):
        self.queue.put(data)

    def close(self):
        self.queue.put(StopIteration)

class ChainStreamHandler(StreamingStdOutCallbackHandler):
    def __init__(self, gen):
        super().__init__()
        self.gen = gen

    def on_llm_new_token(self, token: str, **kwargs):
        self.gen.send(token)

def llm_thread(g, prompt):
    try:
        llm = OpenAI(
            model_name="gpt-4",
            verbose=True,
            streaming=True,
            callback_manager=BaseCallbackManager([ChainStreamHandler(g)]),
            temperature=0.7,
        )
        llm(prompt)
    finally:
        g.close()

def chain(prompt):
    g = ThreadedGenerator()
    threading.Thread(target=llm_thread, args=(g, prompt)).start()
    return g

@app.route('/chain', methods=['POST'])
def _chain():
    return Response(chain("Create a poem about the meaning of life \n\n"), mimetype='text/plain')

if __name__ == '__main__':
    app.run(threaded=True, debug=True)
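The streaming in the version above hinges on `on_llm_new_token`: LangChain invokes it once per generated token, and the handler forwards each token into the queue that the Flask response iterates over. That handoff can be illustrated without LangChain or the network; the sketch below uses a hypothetical `QueueHandler` in place of `ChainStreamHandler` (no LangChain base class) and a hard-coded token list in place of a real model:

```python
import queue

class QueueHandler:
    # Simplified stand-in for ChainStreamHandler: all it does is forward tokens.
    def __init__(self, q):
        self.q = q

    def on_llm_new_token(self, token, **kwargs):
        self.q.put(token)

q = queue.Queue()
handler = QueueHandler(q)

# In the real app, the streaming LLM calls this once per token:
for token in ["The ", "meaning ", "of ", "life"]:
    handler.on_llm_new_token(token)

received = []
while not q.empty():
    received.append(q.get())
print("".join(received))  # The meaning of life
```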

Version with error (OpenAI replaced with ChatOpenAI):

from flask import Flask, Response
import threading
import queue

from langchain.chat_models import ChatOpenAI
from langchain.callbacks.base import BaseCallbackManager
from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler

app = Flask(__name__)

@app.route('/')
def index():
    return Response('''<!DOCTYPE html>
<html>
<head><title>Flask Streaming Langchain Example</title></head>
<body>
<div id="output"></div>
<script>
const outputEl = document.getElementById('output');

(async function() {
  try {
    const controller = new AbortController();
    const signal = controller.signal;
    const timeout = 120000;  // Set the timeout to 120 seconds
    setTimeout(() => controller.abort(), timeout);

    const response = await fetch('/chain', {method: 'POST', signal});
    const reader = response.body.getReader();
    const decoder = new TextDecoder();
    let buffer = '';

    while (true) {
      const { done, value } = await reader.read();
      if (done) { break; }
      const text = decoder.decode(value, {stream: true});
      outputEl.innerHTML += text;
    }
  } catch (err) {
    console.error(err);
  }
})();
</script>
</body>
</html>''', mimetype='text/html')

class ThreadedGenerator:
    def __init__(self):
        self.queue = queue.Queue()

    def __iter__(self):
        return self

    def __next__(self):
        item = self.queue.get()
        if item is StopIteration:
            raise item
        return item

    def send(self, data):
        self.queue.put(data)

    def close(self):
        self.queue.put(StopIteration)

class ChainStreamHandler(StreamingStdOutCallbackHandler):
    def __init__(self, gen):
        super().__init__()
        self.gen = gen

    def on_llm_new_token(self, token: str, **kwargs):
        self.gen.send(token)

    def on_chat_model_start(self, token: str):
        print("started")

def llm_thread(g, prompt):
    try:
        llm = ChatOpenAI(
            model_name="gpt-4",
            verbose=True,
            streaming=True,
            callback_manager=BaseCallbackManager([ChainStreamHandler(g)]),
            temperature=0.7,
        )
        llm(prompt)
    finally:
        g.close()

def chain(prompt):
    g = ThreadedGenerator()
    threading.Thread(target=llm_thread, args=(g, prompt)).start()
    return g

@app.route('/chain', methods=['POST'])
def _chain():
    return Response(chain("parlami dei 5 modi di dire in inglese che gli italiani conoscono meno \n\n"), mimetype='text/plain')

if __name__ == '__main__':
    app.run(threaded=True, debug=True)

Error shown in the console at startup and when I open the web page:


Error in ChainStreamHandler.on_chat_model_start callback: ChainStreamHandler.on_chat_model_start() got an unexpected keyword argument 'run_id'
Exception in thread Thread-4 (llm_thread):
127.0.0.1 - - [09/Sep/2023 18:09:29] "POST /chain HTTP/1.1" 200 -
Traceback (most recent call last):
  File "C:\Users\user22\Desktop\Work\TESTPROJ\env\Lib\site-packages\langchain\callbacks\manager.py", line 300, in _handle_event
    getattr(handler, event_name)(*args, **kwargs)
  File "C:\Users\user22\Desktop\Work\TESTPROJ\env\Lib\site-packages\langchain\callbacks\base.py", line 168, in on_chat_model_start
    raise NotImplementedError(
NotImplementedError: StdOutCallbackHandler does not implement `on_chat_model_start`

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\user22\AppData\Local\Programs\Python\Python311\Lib\threading.py", line 1038, in _bootstrap_inner
    self.run()
  File "C:\Users\user22\AppData\Local\Programs\Python\Python311\Lib\threading.py", line 975, in run
    self._target(*self._args, **self._kwargs)
  File "c:\Users\user22\Desktop\Work\TESTPROJ\streamresp.py", line 90, in llm_thread
    llm(prompt)
  File "C:\Users\user22\Desktop\Work\TESTPROJ\env\Lib\site-packages\langchain\chat_models\base.py", line 552, in __call__
    generation = self.generate(
                 ^^^^^^^^^^^^^^
  File "C:\Users\user22\Desktop\Work\TESTPROJ\env\Lib\site-packages\langchain\chat_models\base.py", line 293, in generate
    run_managers = callback_manager.on_chat_model_start(
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\user22\Desktop\Work\TESTPROJ\env\Lib\site-packages\langchain\callbacks\manager.py", line 1112, in on_chat_model_start
    _handle_event(
  File "C:\Users\user22\Desktop\Work\TESTPROJ\env\Lib\site-packages\langchain\callbacks\manager.py", line 304, in _handle_event
    message_strings = [get_buffer_string(m) for m in args[1]]
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\user22\Desktop\Work\TESTPROJ\env\Lib\site-packages\langchain\callbacks\manager.py", line 304, in <listcomp>
    message_strings = [get_buffer_string(m) for m in args[1]]
                       ^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\user22\Desktop\Work\TESTPROJ\env\Lib\site-packages\langchain\schema\messages.py", line 52, in get_buffer_string
    raise ValueError(f"Got unsupported message type: {m}")
ValueError: Got unsupported message type: p
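The final `ValueError` hints at the root cause: the chat model is given a plain string, but the callback machinery iterates over what it assumes is a list of message objects (`for m in args[1]` in the traceback). Iterating a string in Python yields single characters, so the first item it sees is `'p'`, the first letter of the Italian prompt. A minimal illustration of just that iteration behaviour (pure Python, no LangChain needed):

```python
# The prompt passed to ChatOpenAI in the failing version:
prompt = "parlami dei 5 modi di dire in inglese che gli italiani conoscono meno \n\n"

# The callback manager effectively iterates `args[1]`; with a string,
# that produces one character at a time instead of message objects.
first = next(iter(prompt))
print(repr(first))  # 'p' -- the character in "Got unsupported message type: p"
```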

Thank you very much for the support!


Solution:

Thanks to the user python273 on GitHub, I've resolved it:


import os
os.environ["OPENAI_API_KEY"] = ""

from flask import Flask, Response, request
import threading
import queue

from langchain.chat_models import ChatOpenAI
from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler
from langchain.schema import AIMessage, HumanMessage, SystemMessage

app = Flask(__name__)

@app.route('/')
def index():
    # just for the example, html is included directly, move to .html file
    return Response('''
<!DOCTYPE html>
<html>
<head><title>Flask Streaming Langchain Example</title></head>
<body>
<form id="form">
  <input name="prompt" value="write a short koan story about seeing beyond"/>
  <input type="submit"/>
</form>
<div id="output"></div>
<script>
const formEl = document.getElementById('form');
const outputEl = document.getElementById('output');

let aborter = new AbortController();

async function run() {
  aborter.abort();  // cancel previous request
  outputEl.innerText = '';
  aborter = new AbortController();

  const prompt = new FormData(formEl).get('prompt');
  try {
    const response = await fetch('/chain', {
      signal: aborter.signal,
      method: 'POST',
      headers: {'Content-Type': 'application/json'},
      body: JSON.stringify({prompt}),
    });
    const reader = response.body.getReader();
    const decoder = new TextDecoder();

    while (true) {
      const { done, value } = await reader.read();
      if (done) { break; }
      const decoded = decoder.decode(value, {stream: true});
      outputEl.innerText += decoded;
    }
  } catch (err) {
    console.error(err);
  }
}

run();  // run on initial prompt
formEl.addEventListener('submit', function(event) {
  event.preventDefault();
  run();
});
</script>
</body>
</html>
''', mimetype='text/html')

class ThreadedGenerator:
    def __init__(self):
        self.queue = queue.Queue()

    def __iter__(self):
        return self

    def __next__(self):
        item = self.queue.get()
        if item is StopIteration:
            raise item
        return item

    def send(self, data):
        self.queue.put(data)

    def close(self):
        self.queue.put(StopIteration)

class ChainStreamHandler(StreamingStdOutCallbackHandler):
    def __init__(self, gen):
        super().__init__()
        self.gen = gen

    def on_llm_new_token(self, token: str, **kwargs):
        self.gen.send(token)

def llm_thread(g, prompt):
    try:
        chat = ChatOpenAI(
            verbose=True,
            streaming=True,
            callbacks=[ChainStreamHandler(g)],
            temperature=0.7,
        )
        chat([HumanMessage(content=prompt)])
    finally:
        g.close()

def chain(prompt):
    g = ThreadedGenerator()
    threading.Thread(target=llm_thread, args=(g, prompt)).start()
    return g

@app.route('/chain', methods=['POST'])
def _chain():
    return Response(chain(request.json['prompt']), mimetype='text/plain')

if __name__ == '__main__':
    app.run(threaded=True, debug=True)
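The two decisive changes versus the broken version are: the handler is passed as `callbacks=[...]` instead of being wrapped in `BaseCallbackManager`, and the model is called with `[HumanMessage(content=prompt)]` instead of a raw string. The `ThreadedGenerator` plumbing itself is independent of LangChain, so it can be exercised on its own. The sketch below replaces the LLM thread with a hypothetical `fake_llm_thread` that sends a fixed token list, then consumes the generator the same way Flask's `Response` does:

```python
import threading
import queue

class ThreadedGenerator:
    """Queue-backed generator: one thread sends tokens, another iterates."""
    def __init__(self):
        self.queue = queue.Queue()

    def __iter__(self):
        return self

    def __next__(self):
        item = self.queue.get()  # blocks until the producer sends something
        if item is StopIteration:
            raise item
        return item

    def send(self, data):
        self.queue.put(data)

    def close(self):
        self.queue.put(StopIteration)

def fake_llm_thread(g, tokens):
    # Stands in for llm_thread: in the real app, on_llm_new_token
    # calls g.send(token) for each streamed token.
    try:
        for t in tokens:
            g.send(t)
    finally:
        g.close()

g = ThreadedGenerator()
threading.Thread(target=fake_llm_thread, args=(g, ["Hello", ", ", "world"])).start()

# Flask's Response(chain(...)) consumes the generator just like this:
result = "".join(g)
print(result)  # Hello, world
```

Because `queue.Queue.get` blocks, the consumer simply waits for the next token, and the `StopIteration` sentinel pushed by `close()` cleanly ends the iteration even if the producer thread fails mid-stream.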

Link to the original reply: https://gist.github.com/python273/563177b3ad5b9f74c0f8f3299ec13850





