An Overview of a Python Script for Automated Chat Responses using OpenAI's GPT-3

Abstract

This Python project automates chat responses in the popular messaging application WeChat (微信) using OpenAI's GPT-3 model. It drives the WeChat desktop client through the uiautomation library and calls the gpt-3.5-turbo chat model through the openai library to generate contextually relevant replies to incoming messages.

Project Design

The project is structured around two main functions: the GPT function, which interacts with the GPT-3 model, and the main function, which handles the user interface and message processing in WeChat.

GPT Function

The GPT function generates responses using the GPT-3 model (specifically gpt-3.5-turbo). It takes as input a string text containing the incoming message and a list record that holds the conversation history. The function appends the incoming message to the record, sets the OpenAI API key, and requests a chat completion from the OpenAI API with the full record as context. It then extracts the assistant's message from the response, appends it to the record, and returns it.
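For illustration, the conversation record is a plain list of role/content dictionaries in the format expected by the chat completion API. A hypothetical record after one exchange might look like this (the message contents below are made up; template is the global prompt string defined in the complete code):

# Hypothetical contents of `record` after one exchange
record = [
    {"role": "user", "content": template},                        # priming template set in main()
    {"role": "user", "content": "What are your opening hours?"},  # incoming WeChat message
    {"role": "assistant", "content": "We are open 9 am to 6 pm, Monday to Friday."},  # generated reply
]

The GPT function below builds on this structure, passing the whole record to the API on every call so the model can reply with the conversation context in mind.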

# Define a function to handle the interaction with the GPT-3 model
def GPT(text: str, record: List[dict]) -> str:
    # Append the user's message to the conversation record
    record.append({"role": "user", "content": text})
    # Print the user's message for debugging purposes
    print(text)
    # Set the OpenAI API key
    openai.api_key = apiKey
    # Define the model ID
    id = "gpt-3.5-turbo"
    # Initialize a question string with the user's text
    question = "###" + text + "###"
    # Create a completion using the OpenAI API
    response = openai.ChatCompletion.create(
        model=id,
        messages=record,
        temperature=0.1,
        top_p=0.1,
        max_tokens=2048,
    )
    # Extract the total token usage from the response for debugging purposes
    usage = response.usage["total_tokens"]
    # Extract the content from the assistant's message in the response
    response = response["choices"][0]["message"]["content"]
    # Append the assistant's message to the conversation record
    record.append({"role": "assistant", "content": response})
    # Return the assistant's response
    return response

Main Function

The main function is the core of the project. It initializes the conversation record with the prompt template as a user message and creates a WindowControl object for the WeChat (微信) window. It then creates a ListControl object for the conversation list (会话) in that window.

The function then enters a loop where it continuously checks for unread messages in the chat. When an unread message is found, it clicks on the conversation, extracts the content of the last message, and generates a reply with the GPT function. If the conversation record grows beyond six entries, it is trimmed back to the prompt template plus the most recent exchanges to keep the context size manageable (a worked example of the trimming follows below). The reply is then typed into the chat window and sent; if no reply is produced, a default message is sent instead.
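As a concrete, hypothetical trace of the trimming step: once the record has grown to seven entries (the template plus three user/assistant exchanges), the three oldest entries are dropped and the template is prepended again, leaving the template plus the two most recent exchanges.

# Hypothetical memory state after three exchanges: the template plus 3 user/assistant pairs
mem = [
    {"role": "user", "content": template},
    {"role": "user", "content": "msg 1"}, {"role": "assistant", "content": "reply 1"},
    {"role": "user", "content": "msg 2"}, {"role": "assistant", "content": "reply 2"},
    {"role": "user", "content": "msg 3"}, {"role": "assistant", "content": "reply 3"},
]
# len(mem) == 7 > 6, so the three oldest entries are dropped ...
mem = mem[3:]
# ... and the template is put back at the front
mem = [{"role": "user", "content": template}] + mem
# mem now holds 5 entries: the template plus the two most recent exchanges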

The script is designed to handle exceptions gracefully. If an error occurs during the execution of the main function, it prints the error and breaks the loop.

def main():
    # Initialize the conversation with the user's template
    mem = [{"role": "user", "content": template}]
    try:
        # Create a WindowControl object for the "微信" window
        wx = WindowControl(Name="微信")
        print(wx)
        wx.SwitchToThisWindow()
        # Create a ListControl object for the "会话" list in the window
        chat = wx.ListControl(Name="会话")
        print("Find chat and combine:", chat)
    except Exception as e:
        print(f"Error: {e}")
        return

    # Main loop
    while True:
        try:
            # Look for unread messages in the chat
            unread = chat.TextControl(searchDepth=4)
            # Wait until an unread message appears
            while not unread.Exists(0):
                pass
            print("Find unread:", unread)
            # If there is an unread message, click on it
            if not unread.Name: continue
            unread.Click(simulateMove=False)
            # Extract the content of the last message
            last_msg = wx.ListControl(Name="消息").GetChildren()[-1].Name
            print("Last message:", last_msg)
            # Generate a reply using the GPT-3 model
            reply = GPT(last_msg, mem)
            # If the conversation record has more than 6 messages, trim it down
            if len(mem) > 6:
                mem = mem[3:]
                mem = [{"role": "user", "content": template}] + mem
            print("memory size:", len(mem))
            # Prepare the reply to be sent
            ar = [reply]
            # If there is a reply, send it
            if ar:
                wx.SwitchToThisWindow()
                wx.SendKeys(ar[0].replace(
                    "{br}", "{Shift}{Enter}"), waitTime=0)
                wx.SendKeys("{Enter}", waitTime=0)
                wx.TextControl(SubName=ar[0][:5]).Click()
            else:
                # If there is no reply, send a default message
                wx.SwitchToThisWindow()
                wx.SendKeys("Unknown", waitTime=0)
                wx.SendKeys("{Enter}", waitTime=0)
                wx.TextControl(SubName=last_msg[:5]).Click()
        except Exception as e:
            print(f"Error: {e}")
            break

Execution

The script is designed to be run directly. When it is executed as a script rather than imported as a module, it calls the main function and starts the automated chat response loop.
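For reference, the entry point is the standard guard at the bottom of the complete code listing:

# Run the main function if this script is run directly
if __name__ == "__main__":
    main()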

Conclusion

In summary, this Python project is a practical example of using OpenAI's GPT-3 model to automate responses in a chat application. It demonstrates how to call the model, drive the user interface of a chat client, and manage conversation history for context-aware replies. The project illustrates the potential of AI for automating tasks and improving the user experience in messaging platforms.

Complete Code

from typing import List
import pandas as pd
import numpy as np
from uiautomation import WindowControl
import openai

apiKey = ""

template = ""

# Define a function to handle the interaction with the GPT-3 model
def GPT(text: str, record: List[dict]) -> str:
    # Append the user's message to the conversation record
    record.append({"role": "user", "content": text})
    # Print the user's message for debugging purposes
    print(text)
    # Set the OpenAI API key
    openai.api_key = apiKey
    # Define the model ID
    id = "gpt-3.5-turbo"
    # Initialize a question string with the user's text
    question = "###" + text + "###"
    # Create a completion using the OpenAI API
    response = openai.ChatCompletion.create(
        model=id,
        messages=record,
        temperature=0.1,
        top_p=0.1,
        max_tokens=2048,
    )
    # Extract the total token usage from the response for debugging purposes
    usage = response.usage["total_tokens"]
    # Extract the content from the assistant's message in the response
    response = response["choices"][0]["message"]["content"]
    # Append the assistant's message to the conversation record
    record.append({"role": "assistant", "content": response})
    # Return the assistant's response
    return response


# Main function
def main():
    # Initialize the conversation with the user's template
    mem = [{"role": "user", "content": template}]
    try:
        # Create a WindowControl object for the "微信" window
        wx = WindowControl(Name="微信")
        print(wx)
        wx.SwitchToThisWindow()
        # Create a ListControl object for the "会话" list in the window
        chat = wx.ListControl(Name="会话")
        print("Find chat and combine:", chat)
    except Exception as e:
        print(f"Error: {e}")
        return

    # Main loop
    while True:
        try:
            # Look for unread messages in the chat
            unread = chat.TextControl(searchDepth=4)
            # Wait until an unread message appears
            while not unread.Exists(0):
                pass
            print("Find unread:", unread)
            # If there is an unread message, click on it
            if not unread.Name: continue
            unread.Click(simulateMove=False)
            # Extract the content of the last message
            last_msg = wx.ListControl(Name="消息").GetChildren()[-1].Name
            print("Last message:", last_msg)
            # Generate a reply using the GPT-3 model
            reply = GPT(last_msg, mem)
            # If the conversation record has more than 6 messages, trim it down
            if len(mem) > 6:
                mem = mem[3:]
                mem = [{"role": "user", "content": template}] + mem
            print("memory size:", len(mem))
            # Prepare the reply to be sent
            ar = [reply]
            # If there is a reply, send it
            if ar:
                wx.SwitchToThisWindow()
                wx.SendKeys(ar[0].replace(
                    "{br}", "{Shift}{Enter}"), waitTime=0)
                wx.SendKeys("{Enter}", waitTime=0)
                wx.TextControl(SubName=ar[0][:5]).Click()
            else:
                # If there is no reply, send a default message
                wx.SwitchToThisWindow()
                wx.SendKeys("Unknown", waitTime=0)
                wx.SendKeys("{Enter}", waitTime=0)
                wx.TextControl(SubName=last_msg[:5]).Click()
        except Exception as e:
            print(f"Error: {e}")
            break


# Run the main function if this script is run directly
if __name__ == "__main__":
    main()