chatmodel: fix incorrect currentResponse argument #3245

Merged · 3 commits · Dec 9, 2024
6 changes: 6 additions & 0 deletions gpt4all-chat/CHANGELOG.md
```diff
@@ -4,6 +4,11 @@ All notable changes to this project will be documented in this file.
 
 The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/).
 
+## [Unreleased]
+
+### Fixed
+- Fix an incorrect value for currentResponse ([#3245](https://github.com/nomic-ai/gpt4all/pull/3245))
+
 ## [3.5.0] - 2024-12-09
 
 ### Changed
@@ -203,6 +208,7 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/).
 - Fix several Vulkan resource management issues ([#2694](https://github.com/nomic-ai/gpt4all/pull/2694))
 - Fix crash/hang when some models stop generating, by showing special tokens ([#2701](https://github.com/nomic-ai/gpt4all/pull/2701))
 
+[Unreleased]: https://github.com/nomic-ai/gpt4all/compare/v3.5.0...HEAD
 [3.5.0]: https://github.com/nomic-ai/gpt4all/compare/v3.5.0-rc2...v3.5.0
 [3.5.0-rc2]: https://github.com/nomic-ai/gpt4all/compare/v3.5.0-rc1...v3.5.0-rc2
 [3.5.0-rc1]: https://github.com/nomic-ai/gpt4all/compare/v3.4.2...v3.5.0-rc1
```
23 changes: 1 addition & 22 deletions gpt4all-chat/src/chatmodel.h
```diff
@@ -193,9 +193,6 @@ class ChatModel : public QAbstractListModel
         NameRole = Qt::UserRole + 1,
         ValueRole,
 
-        // prompts and responses
-        PeerRole,
-
         // prompts
         PromptAttachmentsRole,
 
@@ -266,18 +263,6 @@ class ChatModel : public QAbstractListModel
             return item->name;
         case ValueRole:
             return item->value;
-        case PeerRole:
-            switch (item->type()) {
-                using enum ChatItem::Type;
-            case Prompt:
-            case Response:
-                {
-                    auto peer = getPeerUnlocked(item);
-                    return peer ? QVariant::fromValue(**peer) : QVariant::fromValue(nullptr);
-                }
-            default:
-                return QVariant();
-            }
         case PromptAttachmentsRole:
             return QVariant::fromValue(item->promptAttachments);
         case SourcesRole:
@@ -320,7 +305,6 @@ class ChatModel : public QAbstractListModel
         return {
             { NameRole, "name" },
             { ValueRole, "value" },
-            { PeerRole, "peer" },
             { PromptAttachmentsRole, "promptAttachments" },
             { SourcesRole, "sources" },
             { ConsolidatedSourcesRole, "consolidatedSources" },
```
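For context on what removing `PeerRole` means downstream: in Qt's model/view system, every entry returned by `roleNames()` becomes a property a QML delegate can bind to, so deleting `{ PeerRole, "peer" }` removes the `model.peer` binding along with the C++ role. A minimal sketch of that mechanism (the `StubModel` class and its `name` role are illustrative stand-ins, not gpt4all code):

```cpp
#include <QAbstractListModel>

// Illustrative stand-in for ChatModel with a single custom role.
class StubModel : public QAbstractListModel
{
    Q_OBJECT

public:
    enum Roles { NameRole = Qt::UserRole + 1 };

    int rowCount(const QModelIndex & = {}) const override
    { return static_cast<int>(m_names.size()); }

    QVariant data(const QModelIndex &index, int role) const override
    {
        if (role == NameRole)
            return m_names.value(index.row());
        return {}; // unknown role
    }

    // Each entry here becomes a QML delegate binding, e.g. "model.name".
    // This is the table PeerRole was dropped from in ChatModel.
    QHash<int, QByteArray> roleNames() const override
    { return { { NameRole, QByteArrayLiteral("name") } }; }

private:
    QStringList m_names { "prompt", "response" };
};
```

Any delegate still reading `model.peer` after this change would see `undefined`, which is why the role is removed from the enum, from `data()`, and from `roleNames()` in the same commit.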
```diff
@@ -362,18 +346,13 @@ class ChatModel : public QAbstractListModel
             count = m_chatItems.count();
         }
 
-        int promptIndex = 0;
         beginInsertRows(QModelIndex(), count, count);
         {
             QMutexLocker locker(&m_mutex);
-            m_chatItems.emplace_back(ChatItem::response_tag, promptIndex);
-            if (auto pi = getPeerUnlocked(m_chatItems.size() - 1))
-                promptIndex = *pi;
+            m_chatItems.emplace_back(ChatItem::response_tag);
         }
         endInsertRows();
         emit countChanged();
-        if (promptIndex >= 0)
-            emit dataChanged(createIndex(promptIndex, 0), createIndex(promptIndex, 0), {PeerRole});
     }
 
     // Used by Server to append a new conversation to the chat log.
```
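Why the old call was wrong (a reconstruction from the PR title; the `ChatItem` constructor itself is outside this diff): the second parameter of the `response_tag` constructor apparently no longer means "prompt index", so forwarding `promptIndex` (always `0` at that point) silently supplied a wrong value for the item's `currentResponse` state. A hypothetical sketch of that failure mode, assuming a signature like `ChatItem(response_tag_t, bool isCurrentResponse = true)`:

```cpp
#include <cassert>

// Hypothetical stand-in for ChatItem. The second constructor parameter
// was repurposed, so an old call site passing an int became a silent bug.
struct Item {
    struct response_tag_t {};
    static constexpr response_tag_t response_tag {};

    // Current meaning: is this the response being generated right now?
    explicit Item(response_tag_t, bool isCurrentResponse = true)
        : currentResponse(isCurrentResponse) {}

    bool currentResponse;
};

int main()
{
    int promptIndex = 0;
    Item broken(Item::response_tag, promptIndex); // 0 converts to false: wrong
    Item fixed(Item::response_tag);               // defaults to true, as intended
    assert(!broken.currentResponse && fixed.currentResponse);
}
```

Dropping the argument, together with the now-unused `promptIndex`/`getPeerUnlocked` bookkeeping and the `PeerRole` `dataChanged` emission, lets the constructor's default do the right thing.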