Merge branch 'feature'

main
ChenZhaoYu 2 years ago
commit ba41015df8

@@ -1,6 +1,6 @@
**/node_modules
*/node_modules
node_modules
Dockerfile
.git
.husky
.github
.vscode
.*
*/.*

@@ -4,7 +4,11 @@ FROM node:lts-alpine AS builder
COPY ./ /app
WORKDIR /app
RUN npm install pnpm -g && pnpm install && pnpm run build
RUN apk add --no-cache git \
&& npm install pnpm -g \
&& pnpm install \
&& pnpm run build \
&& rm -rf /root/.npm /root/.pnpm-store /usr/local/share/.cache /tmp/*
# service
FROM node:lts-alpine
@@ -13,8 +17,12 @@ COPY /service /app
COPY --from=builder /app/dist /app/public
WORKDIR /app
RUN npm install pnpm -g && pnpm install
RUN apk add --no-cache git \
&& npm install pnpm -g \
&& pnpm install --only=production \
&& rm -rf /root/.npm /root/.pnpm-store /usr/local/share/.cache /tmp/*
EXPOSE 3002
CMD ["pnpm", "run", "start"]
CMD ["pnpm", "run", "start"]

@@ -168,6 +168,7 @@ pnpm dev
- `OPENAI_API_KEY` one of the two
- `OPENAI_ACCESS_TOKEN` one of the two; `OPENAI_API_KEY` takes precedence when both are present (see the sketch after this list)
- `OPENAI_API_BASE_URL` optional, available when `OPENAI_API_KEY` is set
- `OPENAI_API_Model` optional, available when `OPENAI_API_KEY` is set
- `API_REVERSE_PROXY` optional, available when `OPENAI_ACCESS_TOKEN` is set [Reference](#introduction)
- `AUTH_SECRET_KEY` Access Password, optional
- `TIMEOUT_MS` timeout, in milliseconds, optional
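A minimal sketch of the rule above (illustrative only; the service's actual wiring may differ):

```ts
// Illustrative sketch, not the project's exact code: the official OpenAI API is
// used whenever OPENAI_API_KEY is set, otherwise the Web API access token is used.
const apiKey = process.env.OPENAI_API_KEY
const accessToken = process.env.OPENAI_ACCESS_TOKEN

if (apiKey)
  console.log('Using OpenAI API, model:', process.env.OPENAI_API_Model ?? 'gpt-3.5-turbo')
else if (accessToken)
  console.log('Using Web API via OPENAI_ACCESS_TOKEN')
else
  throw new Error('Set either OPENAI_API_KEY or OPENAI_ACCESS_TOKEN')
```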
@@ -210,6 +211,8 @@ services:
OPENAI_ACCESS_TOKEN: xxxxxx
# api interface url, optional, available when OPENAI_API_KEY is set
OPENAI_API_BASE_URL: xxxx
# api model, optional, available when OPENAI_API_KEY is set
OPENAI_API_Model: xxxx
# reverse proxy, optional
API_REVERSE_PROXY: xxx
# access password, optional
@@ -222,6 +225,7 @@ services:
SOCKS_PROXY_PORT: xxxx
```
The `OPENAI_API_BASE_URL` is optional and only used when `OPENAI_API_KEY` is set.
The `OPENAI_API_Model` is optional and only used when `OPENAI_API_KEY` is set.
### Deployment with Railway
@@ -237,6 +241,7 @@ The `OPENAI_API_BASE_URL` is optional and only used when setting the `OPENAI_API
| `OPENAI_API_KEY` | Optional | Required for `OpenAI API`. `apiKey` can be obtained from [here](https://platform.openai.com/overview). |
| `OPENAI_ACCESS_TOKEN`| Optional | Required for `Web API`. `accessToken` can be obtained from [here](https://chat.openai.com/api/auth/session).|
| `OPENAI_API_BASE_URL` | Optional, only for `OpenAI API` | API endpoint. |
| `OPENAI_API_Model` | Optional, only for `OpenAI API` | API model. |
| `API_REVERSE_PROXY` | Optional, only for `Web API` | Reverse proxy address for `Web API`. [Details](https://github.com/transitive-bullshit/chatgpt-api#reverse-proxy) |
| `SOCKS_PROXY_HOST` | Optional, effective with `SOCKS_PROXY_PORT` | Socks proxy. |
| `SOCKS_PROXY_PORT` | Optional, effective with `SOCKS_PROXY_HOST` | Socks proxy port. |

@@ -166,6 +166,7 @@ pnpm dev
- `OPENAI_API_KEY` one of the two
- `OPENAI_ACCESS_TOKEN` one of the two; `OPENAI_API_KEY` takes precedence when both are present
- `OPENAI_API_BASE_URL` optional, available when `OPENAI_API_KEY` is set
- `OPENAI_API_Model` optional, available when `OPENAI_API_KEY` is set
- `API_REVERSE_PROXY` optional, available when `OPENAI_ACCESS_TOKEN` is set [Reference](#介绍)
- `AUTH_SECRET_KEY` Access Password, optional
- `TIMEOUT_MS` timeout, in milliseconds, optional
@@ -208,6 +209,8 @@ services:
OPENAI_ACCESS_TOKEN: xxxxxx
# api base url, optional, available when OPENAI_API_KEY is set
OPENAI_API_BASE_URL: xxxx
# api model, optional, available when OPENAI_API_KEY is set
OPENAI_API_Model: xxxx
# reverse proxy, optional
API_REVERSE_PROXY: xxx
# access password, optional
@@ -220,6 +223,7 @@ services:
SOCKS_PROXY_PORT: xxxx
```
- `OPENAI_API_BASE_URL` optional, available when `OPENAI_API_KEY` is set
- `OPENAI_API_Model` optional, available when `OPENAI_API_KEY` is set
### Deployment with Railway
[![Deploy on Railway](https://railway.app/button.svg)](https://railway.app/new/template/yytmgc)
@@ -234,6 +238,7 @@ services:
| `OPENAI_API_KEY` | One of the two, for `OpenAI API` | `apiKey` required for `OpenAI API` [(get apiKey)](https://platform.openai.com/overview) |
| `OPENAI_ACCESS_TOKEN` | One of the two, for `Web API` | `accessToken` required for `Web API` [(get accessToken)](https://chat.openai.com/api/auth/session) |
| `OPENAI_API_BASE_URL` | Optional, only for `OpenAI API` | API endpoint |
| `OPENAI_API_Model` | Optional, only for `OpenAI API` | API model |
| `API_REVERSE_PROXY` | Optional, only for `Web API` | Reverse proxy address for `Web API` [Details](https://github.com/transitive-bullshit/chatgpt-api#reverse-proxy) |
| `SOCKS_PROXY_HOST` | Optional, effective with `SOCKS_PROXY_PORT` | Socks proxy |
| `SOCKS_PROXY_PORT` | Optional, effective with `SOCKS_PROXY_HOST` | Socks proxy port |

@@ -12,6 +12,8 @@ services:
OPENAI_ACCESS_TOKEN: xxxxxx
# api base url, optional, available when OPENAI_API_KEY is set
OPENAI_API_BASE_URL: xxxx
# api model, optional, available when OPENAI_API_KEY is set
OPENAI_API_Model: xxxx
# reverse proxy, optional
API_REVERSE_PROXY: xxx
# access password, optional

@@ -7,6 +7,9 @@ OPENAI_ACCESS_TOKEN=
# OpenAI API Base URL - https://api.openai.com
OPENAI_API_BASE_URL=
# OpenAI API Model - https://platform.openai.com/docs/models
OPENAI_API_Model=
# Reverse Proxy
API_REVERSE_PROXY=

@@ -36,7 +36,7 @@ let api: ChatGPTAPI | ChatGPTUnofficialProxyAPI
const options: ChatGPTAPIOptions = {
apiKey: process.env.OPENAI_API_KEY,
completionParams: {
model: 'gpt-3.5-turbo',
model: process.env.OPENAI_API_Model ?? 'gpt-3.5-turbo',
},
debug: false,
}
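The `??` fallback above keeps the previous default when `OPENAI_API_Model` is unset; a tiny illustration (the values below are placeholders, not recommendations):

```ts
// Placeholder value for illustration only.
process.env.OPENAI_API_Model = 'gpt-4'
console.log(process.env.OPENAI_API_Model ?? 'gpt-3.5-turbo') // -> 'gpt-4'

delete process.env.OPENAI_API_Model
console.log(process.env.OPENAI_API_Model ?? 'gpt-3.5-turbo') // -> 'gpt-3.5-turbo'
```

Note that an empty string would not trigger the fallback, since `??` only checks for `null`/`undefined`.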

@@ -27,6 +27,9 @@ export default {
exportImageConfirm: 'Are you sure to export this chat to png?',
exportSuccess: 'Export Success',
exportFailed: 'Export Failed',
usingContext: 'Context Mode',
turnOnContext: 'In the current mode, sending messages will carry previous chat records.',
turnOffContext: 'In the current mode, sending messages will not carry previous chat records.',
deleteMessage: 'Delete Message',
deleteMessageConfirm: 'Are you sure to delete this message?',
deleteHistoryConfirm: 'Are you sure to clear this history?',

@@ -27,6 +27,9 @@ export default {
exportImageConfirm: '是否将会话保存为图片?',
exportSuccess: '保存成功',
exportFailed: '保存失败',
usingContext: '上下文模式',
turnOnContext: '当前模式下, 发送消息会携带之前的聊天记录',
turnOffContext: '当前模式下, 发送消息不会携带之前的聊天记录',
deleteMessage: '删除消息',
deleteMessageConfirm: '是否删除此消息?',
deleteHistoryConfirm: '确定删除此记录?',

@@ -27,6 +27,9 @@ export default {
exportImageConfirm: '是否將對話儲存為圖片?',
exportSuccess: '儲存成功',
exportFailed: '儲存失敗',
usingContext: '上下文模式',
turnOnContext: '在當前模式下, 發送訊息會攜帶之前的聊天記錄。',
turnOffContext: '在當前模式下, 發送訊息不會攜帶之前的聊天記錄。',
deleteMessage: '刪除訊息',
deleteMessageConfirm: '是否刪除此訊息?',
deleteHistoryConfirm: '確定刪除此紀錄?',
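These new keys are consumed in the chat view (see the component diff below); a condensed, hedged sketch of the pattern, with the imports assumed here rather than copied from the project:

```ts
import { useDialog } from 'naive-ui' // matches the dialog.info() calls in the component
import { useI18n } from 'vue-i18n'   // assumption: the project may wrap i18n in its own helper

// Inside a component's setup():
const dialog = useDialog()
const { t } = useI18n()
let usingContext = true // simplified stand-in for the ref used in the chat view

function toggleUsingContext() {
  usingContext = !usingContext
  dialog.info({
    title: t('chat.usingContext'),
    content: usingContext ? t('chat.turnOnContext') : t('chat.turnOffContext'),
    positiveText: t('common.yes'),
  })
}
```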

@@ -33,6 +33,7 @@ const conversationList = computed(() => dataSources.value.filter(item => (!item.
const prompt = ref<string>('')
const loading = ref<boolean>(false)
const usingContext = ref<boolean>(true)
function handleSubmit() {
onConversation()
@@ -68,7 +69,7 @@ async function onConversation() {
let options: Chat.ConversationRequest = {}
const lastContext = conversationList.value[conversationList.value.length - 1]?.conversationOptions
if (lastContext)
if (lastContext && usingContext.value)
options = { ...lastContext }
addChat(
@@ -364,6 +365,24 @@ function handleStop() {
}
}
function toggleUsingContext() {
usingContext.value = !usingContext.value
if (usingContext.value) {
dialog.info({
title: t('chat.usingContext'),
content: t('chat.turnOnContext'),
positiveText: t('common.yes'),
})
}
else {
dialog.info({
title: t('chat.usingContext'),
content: t('chat.turnOffContext'),
positiveText: t('common.yes'),
})
}
}
const placeholder = computed(() => {
if (isMobile.value)
return t('chat.placeholderMobile')
@@ -455,6 +474,11 @@ onUnmounted(() => {
<SvgIcon icon="ri:download-2-line" />
</span>
</HoverButton>
<HoverButton @click="toggleUsingContext">
<span class="text-xl" :class="{ 'text-[#4b9e5f]': usingContext, 'text-[#a8071a]': !usingContext }">
<SvgIcon icon="ri:chat-history-line" />
</span>
</HoverButton>
<NInput
v-model:value="prompt"
type="textarea"
