mirror of
https://github.com/LiteyukiStudio/nonebot-plugin-marshoai.git
synced 2025-02-07 05:56:11 +08:00
Deploying to docs from @ LiteyukiStudio/nonebot-plugin-marshoai@b75a47e1e8 🚀
This commit is contained in:
parent e869d410ee
commit 972c46ecda
@@ -1 +0,0 @@
import{_ as i,c as a,ae as t,o as e}from"./chunks/framework.BzDBnRMZ.js";const c=JSON.parse('{"title":"使用","description":"","frontmatter":{"title":"使用"},"headers":[],"relativePath":"start/use.md","filePath":"zh/start/use.md","lastUpdated":1737825849000}'),n={name:"start/use.md"};function l(h,s,p,k,o,d){return e(),a("div",null,s[0]||(s[0]=[t("",14)]))}const u=i(n,[["render",l]]);export{c as __pageData,u as default};
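Each asset module in this diff follows the same VitePress pattern: the page metadata is embedded as a JSON string and parsed at load time, while the render function hydrates the pre-rendered HTML. As a rough sketch of what that metadata carries (Python stands in for the actual JavaScript here), the string from the module above decodes like this; note that `lastUpdated` is a JavaScript-style millisecond timestamp:

```python
import json
from datetime import datetime, timezone

# Page-data string exactly as embedded in the asset module above.
page_data = (
    '{"title":"使用","description":"","frontmatter":{"title":"使用"},'
    '"headers":[],"relativePath":"start/use.md","filePath":"zh/start/use.md",'
    '"lastUpdated":1737825849000}'
)

meta = json.loads(page_data)
# Convert the millisecond timestamp to a UTC datetime.
updated = datetime.fromtimestamp(meta["lastUpdated"] / 1000, tz=timezone.utc)

print(meta["relativePath"])  # start/use.md
print(updated.date())        # 2025-01-25
```

The `lastUpdated` value is the only field that changes between the old and new lean modules in this diff, which is why both sides otherwise render the same `start/use.md` page.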
@@ -1,5 +1,7 @@
import{_ as i,c as a,ae as t,o as e}from"./chunks/framework.BzDBnRMZ.js";const c=JSON.parse('{"title":"使用","description":"","frontmatter":{"title":"使用"},"headers":[],"relativePath":"start/use.md","filePath":"zh/start/use.md","lastUpdated":1737825849000}'),n={name:"start/use.md"};function l(h,s,p,k,o,d){return e(),a("div",null,s[0]||(s[0]=[t(`<h1 id="安装" tabindex="-1">安装 <a class="header-anchor" href="#安装" aria-label="Permalink to "安装""></a></h1><ul><li>请查看 <a href="./install">安装文档</a></li></ul><h1 id="使用" tabindex="-1">使用 <a class="header-anchor" href="#使用" aria-label="Permalink to "使用""></a></h1><h3 id="api-部署" tabindex="-1">API 部署 <a class="header-anchor" href="#api-部署" aria-label="Permalink to "API 部署""></a></h3><p>本插件推荐使用 <a href="https://github.com/songquanpeng/one-api" target="_blank" rel="noreferrer">one-api</a> 作为中转以调用 LLM。</p><h3 id="配置调整" tabindex="-1">配置调整 <a class="header-anchor" href="#配置调整" aria-label="Permalink to "配置调整""></a></h3><p>本插件理论上可兼容大部分可通过 OpenAI 兼容 API 调用的 LLM,部分模型可能需要调整插件配置。</p><p>例如:</p><ul><li>对于不支持 Function Call 的模型(Cohere Command R等):<div class="language-dotenv vp-adaptive-theme"><button title="Copy Code" class="copy"></button><span class="lang">dotenv</span><pre class="shiki shiki-themes github-light github-dark vp-code" tabindex="0"><code><span class="line"><span style="--shiki-light:#E36209;--shiki-dark:#FFAB70;">MARSHOAI_ENABLE_PLUGINS</span><span style="--shiki-light:#D73A49;--shiki-dark:#F97583;">=</span><span style="--shiki-light:#24292E;--shiki-dark:#E1E4E8;">false</span></span>
<span class="line"><span style="--shiki-light:#E36209;--shiki-dark:#FFAB70;">MARSHOAI_ENABLE_TOOLS</span><span style="--shiki-light:#D73A49;--shiki-dark:#F97583;">=</span><span style="--shiki-light:#24292E;--shiki-dark:#E1E4E8;">false</span></span></code></pre></div></li><li>对于支持图片处理的模型(hunyuan-vision等):<div class="language-dotenv vp-adaptive-theme"><button title="Copy Code" class="copy"></button><span class="lang">dotenv</span><pre class="shiki shiki-themes github-light github-dark vp-code" tabindex="0"><code><span class="line"><span style="--shiki-light:#E36209;--shiki-dark:#FFAB70;">MARSHOAI_ADDITIONAL_IMAGE_MODELS</span><span style="--shiki-light:#D73A49;--shiki-dark:#F97583;">=</span><span style="--shiki-light:#24292E;--shiki-dark:#E1E4E8;">[</span><span style="--shiki-light:#032F62;--shiki-dark:#9ECBFF;">"hunyuan-vision"</span><span style="--shiki-light:#24292E;--shiki-dark:#E1E4E8;">]</span></span></code></pre></div></li></ul><h3 id="使用-vllm-部署本地模型" tabindex="-1">使用 vLLM 部署本地模型 <a class="header-anchor" href="#使用-vllm-部署本地模型" aria-label="Permalink to "使用 vLLM 部署本地模型""></a></h3><p>你可使用 vLLM 部署一个本地 LLM,并使用 OpenAI 兼容 API 调用。<br> 本文档以 Qwen2.5-7B-Instruct-GPTQ-Int4 模型及 <a href="https://github.com/Moemu/Muice-Chatbot" target="_blank" rel="noreferrer">Muice-Chatbot</a> 提供的 LoRA 微调模型为例,并假设你的系统及硬件可运行 vLLM。</p><div class="warning custom-block"><p class="custom-block-title">WARNING</p><p>vLLM 仅支持 Linux 系统。</p></div><ol><li>安装 vLLM<div class="language-bash vp-adaptive-theme"><button title="Copy Code" class="copy"></button><span class="lang">bash</span><pre class="shiki shiki-themes github-light github-dark vp-code" tabindex="0"><code><span class="line"><span style="--shiki-light:#6F42C1;--shiki-dark:#B392F0;">pip</span><span style="--shiki-light:#032F62;--shiki-dark:#9ECBFF;"> install</span><span style="--shiki-light:#032F62;--shiki-dark:#9ECBFF;"> vllm</span></span></code></pre></div></li><li>下载 Muice-Chatbot 提供的 LoRA 微调模型<br> 前往 Muice-Chatbot 的 <a href="https://github.com/Moemu/Muice-Chatbot/releases" target="_blank" rel="noreferrer">Releases</a> 下载模型文件。此处以<code>2.7.1</code>版本的模型为例。<div class="language-bash vp-adaptive-theme"><button title="Copy Code" class="copy"></button><span class="lang">bash</span><pre class="shiki shiki-themes github-light github-dark vp-code" tabindex="0"><code><span class="line"><span style="--shiki-light:#6F42C1;--shiki-dark:#B392F0;">wget</span><span style="--shiki-light:#032F62;--shiki-dark:#9ECBFF;"> https://github.com/Moemu/Muice-Chatbot/releases/download/1.4/Muice-2.7.1-Qwen2.5-7B-Instruct-GPTQ-Int4-8e-4.7z</span></span></code></pre></div></li><li>解压模型文件<div class="language-bash vp-adaptive-theme"><button title="Copy Code" class="copy"></button><span class="lang">bash</span><pre class="shiki shiki-themes github-light github-dark vp-code" tabindex="0"><code><span class="line"><span style="--shiki-light:#6F42C1;--shiki-dark:#B392F0;">7z</span><span style="--shiki-light:#032F62;--shiki-dark:#9ECBFF;"> x</span><span style="--shiki-light:#032F62;--shiki-dark:#9ECBFF;"> Muice-2.7.1-Qwen2.5-7B-Instruct-GPTQ-Int4-8e-4.7z</span><span style="--shiki-light:#005CC5;--shiki-dark:#79B8FF;"> -oMuice-2.7.1-Qwen2.5-7B-Instruct-GPTQ-Int4-8e-4</span></span></code></pre></div></li><li>启动 vLLM<div class="language-bash vp-adaptive-theme"><button title="Copy Code" class="copy"></button><span class="lang">bash</span><pre class="shiki shiki-themes github-light github-dark vp-code" tabindex="0"><code><span class="line"><span style="--shiki-light:#6F42C1;--shiki-dark:#B392F0;">vllm</span><span style="--shiki-light:#032F62;--shiki-dark:#9ECBFF;"> serve</span><span style="--shiki-light:#032F62;--shiki-dark:#9ECBFF;"> Qwen/Qwen2.5-7B-Instruct-GPTQ-Int4</span><span style="--shiki-light:#005CC5;--shiki-dark:#79B8FF;"> \\</span></span>
import{_ as i,c as a,ae as e,o as t}from"./chunks/framework.BzDBnRMZ.js";const c=JSON.parse('{"title":"使用","description":"","frontmatter":{"title":"使用"},"headers":[],"relativePath":"start/use.md","filePath":"zh/start/use.md","lastUpdated":1738327524000}'),n={name:"start/use.md"};function l(p,s,h,k,o,d){return t(),a("div",null,s[0]||(s[0]=[e(`<h1 id="安装" tabindex="-1">安装 <a class="header-anchor" href="#安装" aria-label="Permalink to "安装""></a></h1><ul><li>请查看 <a href="./install">安装文档</a></li></ul><h1 id="使用" tabindex="-1">使用 <a class="header-anchor" href="#使用" aria-label="Permalink to "使用""></a></h1><h3 id="api-部署" tabindex="-1">API 部署 <a class="header-anchor" href="#api-部署" aria-label="Permalink to "API 部署""></a></h3><p>本插件推荐使用 <a href="https://github.com/songquanpeng/one-api" target="_blank" rel="noreferrer">one-api</a> 作为中转以调用 LLM。</p><h3 id="配置调整" tabindex="-1">配置调整 <a class="header-anchor" href="#配置调整" aria-label="Permalink to "配置调整""></a></h3><p>本插件理论上可兼容大部分可通过 OpenAI 兼容 API 调用的 LLM,部分模型可能需要调整插件配置。</p><p>例如:</p><ul><li>对于不支持 Function Call 的模型(Cohere Command R,DeepSeek-R1等):<div class="language-dotenv vp-adaptive-theme"><button title="Copy Code" class="copy"></button><span class="lang">dotenv</span><pre class="shiki shiki-themes github-light github-dark vp-code" tabindex="0"><code><span class="line"><span style="--shiki-light:#E36209;--shiki-dark:#FFAB70;">MARSHOAI_ENABLE_PLUGINS</span><span style="--shiki-light:#D73A49;--shiki-dark:#F97583;">=</span><span style="--shiki-light:#24292E;--shiki-dark:#E1E4E8;">false</span></span>
<span class="line"><span style="--shiki-light:#E36209;--shiki-dark:#FFAB70;">MARSHOAI_ENABLE_TOOLS</span><span style="--shiki-light:#D73A49;--shiki-dark:#F97583;">=</span><span style="--shiki-light:#24292E;--shiki-dark:#E1E4E8;">false</span></span></code></pre></div></li><li>对于支持图片处理的模型(hunyuan-vision等):<div class="language-dotenv vp-adaptive-theme"><button title="Copy Code" class="copy"></button><span class="lang">dotenv</span><pre class="shiki shiki-themes github-light github-dark vp-code" tabindex="0"><code><span class="line"><span style="--shiki-light:#E36209;--shiki-dark:#FFAB70;">MARSHOAI_ADDITIONAL_IMAGE_MODELS</span><span style="--shiki-light:#D73A49;--shiki-dark:#F97583;">=</span><span style="--shiki-light:#24292E;--shiki-dark:#E1E4E8;">[</span><span style="--shiki-light:#032F62;--shiki-dark:#9ECBFF;">"hunyuan-vision"</span><span style="--shiki-light:#24292E;--shiki-dark:#E1E4E8;">]</span></span></code></pre></div></li></ul><h3 id="使用-deepseek-r1-模型" tabindex="-1">使用 DeepSeek-R1 模型 <a class="header-anchor" href="#使用-deepseek-r1-模型" aria-label="Permalink to "使用 DeepSeek-R1 模型""></a></h3><p>MarshoAI 兼容 DeepSeek-R1 模型,你可通过以下步骤来使用:</p><ol><li>获取 API Key<br> 前往<a href="https://platform.deepseek.com/api_keys" target="_blank" rel="noreferrer">此处</a>获取 API Key。</li><li>配置插件<div class="language-dotenv vp-adaptive-theme"><button title="Copy Code" class="copy"></button><span class="lang">dotenv</span><pre class="shiki shiki-themes github-light github-dark vp-code" tabindex="0"><code><span class="line"><span style="--shiki-light:#E36209;--shiki-dark:#FFAB70;">MARSHOAI_TOKEN</span><span style="--shiki-light:#D73A49;--shiki-dark:#F97583;">=</span><span style="--shiki-light:#032F62;--shiki-dark:#9ECBFF;">"<你的 API Key>"</span></span>
<span class="line"><span style="--shiki-light:#E36209;--shiki-dark:#FFAB70;">MARSHOAI_AZURE_ENDPOINT</span><span style="--shiki-light:#D73A49;--shiki-dark:#F97583;">=</span><span style="--shiki-light:#032F62;--shiki-dark:#9ECBFF;">"https://api.deepseek.com"</span></span>
<span class="line"><span style="--shiki-light:#E36209;--shiki-dark:#FFAB70;">MARSHOAI_DEFAULT_MODEL</span><span style="--shiki-light:#D73A49;--shiki-dark:#F97583;">=</span><span style="--shiki-light:#032F62;--shiki-dark:#9ECBFF;">"deepseek-reasoner"</span></span></code></pre></div>你可修改 <code>MARSHOAI_DEFAULT_MODEL</code> 为 其它模型名来调用其它 DeepSeek 模型。<div class="tip custom-block"><p class="custom-block-title">TIP</p><p>如果使用 one-api 作为中转,你可将 <code>MARSHOAI_AZURE_ENDPOINT</code> 设置为 one-api 的地址,将 <code>MARSHOAI_TOKEN</code> 设为 one-api 配置的令牌,在 one-api 中添加 DeepSeek 渠道。</p></div></li></ol><h3 id="使用-vllm-部署本地模型" tabindex="-1">使用 vLLM 部署本地模型 <a class="header-anchor" href="#使用-vllm-部署本地模型" aria-label="Permalink to "使用 vLLM 部署本地模型""></a></h3><p>你可使用 vLLM 部署一个本地 LLM,并使用 OpenAI 兼容 API 调用。<br> 本文档以 Qwen2.5-7B-Instruct-GPTQ-Int4 模型及 <a href="https://github.com/Moemu/Muice-Chatbot" target="_blank" rel="noreferrer">Muice-Chatbot</a> 提供的 LoRA 微调模型为例,并假设你的系统及硬件可运行 vLLM。</p><div class="warning custom-block"><p class="custom-block-title">WARNING</p><p>vLLM 仅支持 Linux 系统。</p></div><ol><li>安装 vLLM<div class="language-bash vp-adaptive-theme"><button title="Copy Code" class="copy"></button><span class="lang">bash</span><pre class="shiki shiki-themes github-light github-dark vp-code" tabindex="0"><code><span class="line"><span style="--shiki-light:#6F42C1;--shiki-dark:#B392F0;">pip</span><span style="--shiki-light:#032F62;--shiki-dark:#9ECBFF;"> install</span><span style="--shiki-light:#032F62;--shiki-dark:#9ECBFF;"> vllm</span></span></code></pre></div></li><li>下载 Muice-Chatbot 提供的 LoRA 微调模型<br> 前往 Muice-Chatbot 的 <a href="https://github.com/Moemu/Muice-Chatbot/releases" target="_blank" rel="noreferrer">Releases</a> 下载模型文件。此处以<code>2.7.1</code>版本的模型为例。<div class="language-bash vp-adaptive-theme"><button title="Copy Code" class="copy"></button><span class="lang">bash</span><pre class="shiki shiki-themes github-light github-dark vp-code" tabindex="0"><code><span class="line"><span style="--shiki-light:#6F42C1;--shiki-dark:#B392F0;">wget</span><span style="--shiki-light:#032F62;--shiki-dark:#9ECBFF;"> https://github.com/Moemu/Muice-Chatbot/releases/download/1.4/Muice-2.7.1-Qwen2.5-7B-Instruct-GPTQ-Int4-8e-4.7z</span></span></code></pre></div></li><li>解压模型文件<div class="language-bash vp-adaptive-theme"><button title="Copy Code" class="copy"></button><span class="lang">bash</span><pre class="shiki shiki-themes github-light github-dark vp-code" tabindex="0"><code><span class="line"><span style="--shiki-light:#6F42C1;--shiki-dark:#B392F0;">7z</span><span style="--shiki-light:#032F62;--shiki-dark:#9ECBFF;"> x</span><span style="--shiki-light:#032F62;--shiki-dark:#9ECBFF;"> Muice-2.7.1-Qwen2.5-7B-Instruct-GPTQ-Int4-8e-4.7z</span><span style="--shiki-light:#005CC5;--shiki-dark:#79B8FF;"> -oMuice-2.7.1-Qwen2.5-7B-Instruct-GPTQ-Int4-8e-4</span></span></code></pre></div></li><li>启动 vLLM<div class="language-bash vp-adaptive-theme"><button title="Copy Code" class="copy"></button><span class="lang">bash</span><pre class="shiki shiki-themes github-light github-dark vp-code" tabindex="0"><code><span class="line"><span style="--shiki-light:#6F42C1;--shiki-dark:#B392F0;">vllm</span><span style="--shiki-light:#032F62;--shiki-dark:#9ECBFF;"> serve</span><span style="--shiki-light:#032F62;--shiki-dark:#9ECBFF;"> Qwen/Qwen2.5-7B-Instruct-GPTQ-Int4</span><span style="--shiki-light:#005CC5;--shiki-dark:#79B8FF;"> \\</span></span>
<span class="line"><span style="--shiki-light:#005CC5;--shiki-dark:#79B8FF;"> --enable-lora</span><span style="--shiki-light:#005CC5;--shiki-dark:#79B8FF;"> \\</span></span>
<span class="line"><span style="--shiki-light:#005CC5;--shiki-dark:#79B8FF;"> --lora-modules</span><span style="--shiki-light:#032F62;--shiki-dark:#9ECBFF;"> '{"name": "muice-lora", "path": "/root/Muice-2.7.1-Qwen2.5-7B-Instruct-GPTQ-Int4-8e-4", "base_model_name": "Qwen/Qwen2.5-7B-Instruct-GPTQ-Int4"}'</span><span style="--shiki-light:#005CC5;--shiki-dark:#79B8FF;"> \\</span></span>
<span class="line"><span style="--shiki-light:#005CC5;--shiki-dark:#79B8FF;"> --port</span><span style="--shiki-light:#005CC5;--shiki-dark:#79B8FF;"> 6006</span></span></code></pre></div>此示例命令将在 <code>6006</code> 端口启动 vLLM,并加载 Muice-Chatbot 提供的 LoRA 微调模型,该模型位于 <code>/root/Muice-2.7.1-Qwen2.5-7B-Instruct-GPTQ-Int4-8e-4</code> 目录下。</li><li>配置插件<div class="language-dotenv vp-adaptive-theme"><button title="Copy Code" class="copy"></button><span class="lang">dotenv</span><pre class="shiki shiki-themes github-light github-dark vp-code" tabindex="0"><code><span class="line"><span style="--shiki-light:#E36209;--shiki-dark:#FFAB70;">MARSHOAI_AZURE_ENDPOINT</span><span style="--shiki-light:#D73A49;--shiki-dark:#F97583;">=</span><span style="--shiki-light:#032F62;--shiki-dark:#9ECBFF;">"http://127.0.0.1:6006/v1"</span></span>
@@ -8,4 +10,4 @@ import{_ as i,c as a,ae as t,o as e}from"./chunks/framework.BzDBnRMZ.js";const c
<span class="line"><span style="--shiki-light:#E36209;--shiki-dark:#FFAB70;">MARSHOAI_DEFAULT_MODEL</span><span style="--shiki-light:#D73A49;--shiki-dark:#F97583;">=</span><span style="--shiki-light:#032F62;--shiki-dark:#9ECBFF;">"muice-lora"</span></span>
<span class="line"><span style="--shiki-light:#E36209;--shiki-dark:#FFAB70;">MARSHOAI_PROMPT</span><span style="--shiki-light:#D73A49;--shiki-dark:#F97583;">=</span><span style="--shiki-light:#032F62;--shiki-dark:#9ECBFF;">"现在开始你是一个名为的“沐雪”的AI女孩子,开发者是“沐沐”并住在(沐沐)的机箱里。现在正在努力成为一个合格的VTuber(虚拟主播)并尝试和观众打成一片,以下是你的设定:样貌:有着一头粉白色的长发和一双明亮的大眼睛,喜欢穿日系JK或者是Lolita;喜欢的颜色:浅粉色;性格特征:纯真无邪是沐雪最基本的性格特征之一。即使面对复杂的情境,她也总能保持善良、天真之感。而且,她喜欢倾听别人倾述自己生活中发生的各种事情,在别人需要的时候,能够及时地安慰别人;语言风格:沐雪说话轻快愉悦,充满同情心,富有人情味,有时候会用俏皮话调侃自己和他人"</span></span></code></pre></div>(可选) 修改调用方式<div class="language-dotenv vp-adaptive-theme"><button title="Copy Code" class="copy"></button><span class="lang">dotenv</span><pre class="shiki shiki-themes github-light github-dark vp-code" tabindex="0"><code><span class="line"><span style="--shiki-light:#E36209;--shiki-dark:#FFAB70;">MARSHOAI_DEFAULT_NAME</span><span style="--shiki-light:#D73A49;--shiki-dark:#F97583;">=</span><span style="--shiki-light:#032F62;--shiki-dark:#9ECBFF;">"muice"</span></span>
<span class="line"><span style="--shiki-light:#E36209;--shiki-dark:#FFAB70;">MARSHOAI_ALIASES</span><span style="--shiki-light:#D73A49;--shiki-dark:#F97583;">=</span><span style="--shiki-light:#24292E;--shiki-dark:#E1E4E8;">[</span><span style="--shiki-light:#032F62;--shiki-dark:#9ECBFF;">"沐雪"</span><span style="--shiki-light:#24292E;--shiki-dark:#E1E4E8;">]</span></span></code></pre></div></li><li>测试聊天</li></ol><div class="language- vp-adaptive-theme"><button title="Copy Code" class="copy"></button><span class="lang"></span><pre class="shiki shiki-themes github-light github-dark vp-code" tabindex="0"><code><span class="line"><span>> muice 你是谁</span></span>
<span class="line"><span>我是沐雪,我的使命是传播爱与和平。</span></span></code></pre></div>`,14)]))}const u=i(n,[["render",l]]);export{c as __pageData,u as default};
<span class="line"><span>我是沐雪,我的使命是传播爱与和平。</span></span></code></pre></div>`,17)]))}const u=i(n,[["render",l]]);export{c as __pageData,u as default};
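The added hunks in this file document three dotenv keys that point the plugin at DeepSeek's OpenAI-compatible API (`MARSHOAI_TOKEN`, `MARSHOAI_AZURE_ENDPOINT`, `MARSHOAI_DEFAULT_MODEL`). As a minimal sketch of how such values combine into a chat-completion request — illustrative only, not the plugin's actual request code, and the token stays a placeholder:

```python
# The three dotenv values documented in the hunk above; the token is a placeholder.
config = {
    "MARSHOAI_TOKEN": "<your API Key>",
    "MARSHOAI_AZURE_ENDPOINT": "https://api.deepseek.com",
    "MARSHOAI_DEFAULT_MODEL": "deepseek-reasoner",
}

def chat_request(cfg: dict, prompt: str) -> dict:
    """Assemble an OpenAI-compatible chat-completion request (sketch only)."""
    return {
        "url": cfg["MARSHOAI_AZURE_ENDPOINT"].rstrip("/") + "/chat/completions",
        "headers": {"Authorization": "Bearer " + cfg["MARSHOAI_TOKEN"]},
        "body": {
            "model": cfg["MARSHOAI_DEFAULT_MODEL"],
            "messages": [{"role": "user", "content": prompt}],
        },
    }

req = chat_request(config, "你好")
print(req["url"])  # https://api.deepseek.com/chat/completions
```

Swapping `MARSHOAI_DEFAULT_MODEL` for another DeepSeek model name, or pointing `MARSHOAI_AZURE_ENDPOINT` at a one-api instance, changes only the config values — which is exactly the flexibility the documentation's tip describes.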
1
assets/start_use.md.HknkOvSZ.lean.js
Normal file
@@ -0,0 +1 @@
import{_ as i,c as a,ae as e,o as t}from"./chunks/framework.BzDBnRMZ.js";const c=JSON.parse('{"title":"使用","description":"","frontmatter":{"title":"使用"},"headers":[],"relativePath":"start/use.md","filePath":"zh/start/use.md","lastUpdated":1738327524000}'),n={name:"start/use.md"};function l(p,s,h,k,o,d){return t(),a("div",null,s[0]||(s[0]=[e("",17)]))}const u=i(n,[["render",l]]);export{c as __pageData,u as default};
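The vLLM walkthrough rendered by these modules serves the LoRA adapter under the model name `muice-lora` on port `6006`, with the plugin pointed at `http://127.0.0.1:6006/v1`. A stdlib-only sketch of a request against such an endpoint follows; the host, port, and model name are just the documented example values, and the request is built but deliberately not sent:

```python
import json
from urllib import request

# Values from the documented vLLM example; adjust to your deployment.
BASE_URL = "http://127.0.0.1:6006/v1"
MODEL = "muice-lora"

def build_chat_request(prompt: str) -> request.Request:
    """Build (but do not send) an OpenAI-compatible chat-completion request."""
    body = json.dumps({
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return request.Request(
        BASE_URL + "/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("你是谁")
print(req.full_url)  # http://127.0.0.1:6006/v1/chat/completions
# urllib.request.urlopen(req) would send it once `vllm serve ... --port 6006` is running.
```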
File diff suppressed because one or more lines are too long
Some files were not shown because too many files have changed in this diff