mirror of
https://github.com/hexastack/hexabot
synced 2025-06-26 18:27:28 +00:00
feat: implement nlp based blocks prioritization strategy
- feat: add weight to nlp entity schema and readapt
- feat: remove commented obsolete code
- feat: restore settings
- feat: apply feedback
- fix: re-adapt unit tests
- feat: priority scoring re-calculation & enabling weight modification in builtin nlp entities
- fix: remove obsolete code
- feat: refine unit tests, apply mr coderabbit suggestions
- fix: minor refactoring
- feat: add nlp cache map type
- feat: refine builtin nlp entities weight updates
- feat: add more test cases and refine edge case handling
- feat: add weight validation in UI
- fix: apply feedback
- feat: add a penalty factor & fix unit tests
- feat: add documentation
- fix: correct syntax
- fix: remove stale log statement
- fix: enforce nlp entity weight restrictions
- fix: correct typo in docs
- fix: typos in docs
- fix: fix formatting for function comment
- fix: restore matchNLP function previous code
- fix: remove blank line, make updateOne asynchronous
- fix: add AND operator in docs
- fix: handle dependency injection in chat module
- feat: refactor to use findAndPopulate in block score calculation
- feat: refine caching mechanisms
- feat: add typing and enforce safety checks
- fix: remove typo
- fix: remove async from block score calculation
- fix: remove typo
- fix: correct linting
- fix: refine nlp pattern type check
- fix: decompose code into helper utils, add nlp entity dto validation, remove type casting
- fix: minor refactoring
- feat: refactor current implementation
This commit is contained in:
parent
0db40680dc
commit
bab2e3082f
102
api/docs/nlp/README.md
Normal file
@@ -0,0 +1,102 @@
# NLP Block Scoring

## Purpose

**NLP Block Scoring** is a mechanism used to select the most relevant response block based on:

- Matching patterns between user input and block definitions
- Configurable weights assigned to each entity type
- Confidence values provided by the NLU engine for detected entities

It enables more intelligent and context-aware block selection in conversational flows.

## Core Use Cases

### Standard Matching

A user input contains entities that directly match a block's patterns.

```ts
Example: Input: intent = enquiry & subject = claim
Block A: Patterns: intent: enquiry & subject: claim
Block A will be selected.
```

### High Confidence, Partial Match

A block may match only some patterns but have high-confidence input on those matched ones, making it a better candidate than others with full matches but low-confidence entities.

**Note: Confidence is multiplied by a predefined weight for each entity type.**

```ts
Example:
Input: intent = issue (confidence: 0.92) & subject = claim (confidence: 0.65)
Block A: Pattern: intent: issue
Block B: Pattern: subject: claim
➤ Block A gets a high score based on confidence × weight (assuming both weights are equal to 1).
```

### Multiple Blocks with Similar Patterns

```ts
Input: intent = issue & subject = insurance
Block A: intent = enquiry & subject = insurance
Block B: subject = insurance
➤ Block B is selected; Block A mismatches on intent.
```

### Exclusion Due to Extra Patterns

If a block contains patterns that require entities not present in the user input, the block is excluded from scoring altogether. No penalties are applied; the block simply isn't considered a valid candidate.

```ts
Input: intent = issue & subject = insurance
Block A: intent = enquiry & subject = insurance & location = office
Block B: subject = insurance & time = morning
➤ Neither block is selected due to unmatched required patterns (`location`, `time`).
```

### Tie-Breaking with Penalty Factors

When multiple blocks receive similar scores, penalty factors can help break the tie, especially in cases where patterns are less specific (e.g., using `Any` as a value).

```ts
Input: intent = enquiry & subject = insurance

Block A: intent = enquiry & subject = Any
Block B: intent = enquiry & subject = insurance
Block C: subject = insurance

Scoring Summary:
- Block A matches both patterns, but subject = Any is considered less specific.
- Block B has a fully specific match on both patterns.
- Block C matches only one pattern.

➤ Block A and Block B have similar raw scores.
➤ A penalty factor is applied to Block A due to its use of Any, reducing its final score.
➤ Block B is selected.
```

## How Scoring Works

### Matching and Confidence

For each entity in the block's pattern:

- If the entity matches an entity in the user input, the score is increased by `confidence × weight`:
  - `Confidence` is a value between 0 and 1, returned by the NLU engine.
  - `Weight` (default value is `1`) is a configured importance factor for that specific entity type.
- If the match is a wildcard (i.e., the block accepts any value), a **penalty factor** is applied to slightly reduce its contribution: `confidence × weight × penaltyFactor`. This encourages more specific matches when available.

### Scoring Formula Summary

For each matched entity:

```ts
score += confidence × weight × [optional penalty factor if wildcard]
```

The total block score is the sum of all matched patterns in that block.
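The summation rule above can be sketched as a small standalone helper. This is an illustrative sketch only, not the project's actual implementation: the `NlpPattern` and `ParseEntity` shapes, the `weights` map, and the `scoreBlock` name are simplified assumptions.

```typescript
// Simplified shapes, assumed for illustration only.
type NlpPattern = { entity: string; match: "value" | "entity"; value?: string };
type ParseEntity = { entity: string; value: string; confidence: number };

// Sums confidence × weight per matched pattern, applying a penalty
// factor to wildcard ("entity") matches so specific matches win ties.
function scoreBlock(
  patterns: NlpPattern[],
  entities: ParseEntity[],
  weights: Map<string, number>,
  penaltyFactor = 0.8,
): number {
  return patterns.reduce((sum, pattern) => {
    const matched = entities.find(
      (e) =>
        e.entity === pattern.entity &&
        (pattern.match === "entity" || e.value === pattern.value),
    );
    // Unmatched patterns contribute nothing to the score
    if (!matched) return sum;
    const weight = weights.get(pattern.entity) ?? 1;
    const penalty = pattern.match === "entity" ? penaltyFactor : 1;
    return sum + matched.confidence * weight * penalty;
  }, 0);
}
```

For instance, with equal weights of 1, a value match at confidence 0.9 contributes 0.9, while a wildcard match at confidence 0.5 contributes 0.5 × 0.8 = 0.4, for a block total of 1.3.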
### Penalty Factor

The **penalty factor** is a global multiplier (typically less than `1`, e.g., `0.8`) applied when the match type is less specific, such as a wildcard or loose entity-type match. It allows the system to:

- Break ties in favor of more precise blocks
- Discourage overly generic blocks from being selected when better matches are available
@@ -16,6 +16,7 @@ import { AttachmentModel } from '@/attachment/schemas/attachment.schema';
 import { AttachmentService } from '@/attachment/services/attachment.service';
 import { ChannelModule } from '@/channel/channel.module';
 import { CmsModule } from '@/cms/cms.module';
+import { NlpModule } from '@/nlp/nlp.module';
 import { UserModule } from '@/user/user.module';
 
 import { BlockController } from './controllers/block.controller';
@@ -68,6 +69,7 @@ import { SubscriberService } from './services/subscriber.service';
     AttachmentModule,
     EventEmitter2,
     UserModule,
+    NlpModule,
   ],
   controllers: [
     CategoryController,
@@ -20,6 +20,15 @@ import { LanguageRepository } from '@/i18n/repositories/language.repository';
 import { LanguageModel } from '@/i18n/schemas/language.schema';
 import { I18nService } from '@/i18n/services/i18n.service';
 import { LanguageService } from '@/i18n/services/language.service';
+import { LoggerService } from '@/logger/logger.service';
+import { NlpEntityRepository } from '@/nlp/repositories/nlp-entity.repository';
+import { NlpSampleEntityRepository } from '@/nlp/repositories/nlp-sample-entity.repository';
+import { NlpValueRepository } from '@/nlp/repositories/nlp-value.repository';
+import { NlpEntityModel } from '@/nlp/schemas/nlp-entity.schema';
+import { NlpSampleEntityModel } from '@/nlp/schemas/nlp-sample-entity.schema';
+import { NlpValueModel } from '@/nlp/schemas/nlp-value.schema';
+import { NlpEntityService } from '@/nlp/services/nlp-entity.service';
+import { NlpValueService } from '@/nlp/services/nlp-value.service';
 import { PluginService } from '@/plugins/plugins.service';
 import { SettingService } from '@/setting/services/setting.service';
 import { InvitationRepository } from '@/user/repositories/invitation.repository';
@@ -93,6 +102,9 @@ describe('BlockController', () => {
         RoleModel,
         PermissionModel,
         LanguageModel,
+        NlpEntityModel,
+        NlpSampleEntityModel,
+        NlpValueModel,
       ]),
     ],
     providers: [
@@ -116,6 +128,12 @@ describe('BlockController', () => {
       PermissionService,
       LanguageService,
       PluginService,
+      LoggerService,
+      NlpEntityService,
+      NlpEntityRepository,
+      NlpSampleEntityRepository,
+      NlpValueRepository,
+      NlpValueService,
       {
         provide: I18nService,
         useValue: {
@@ -8,6 +8,8 @@
 
 import { z } from 'zod';
 
+import { BlockFull } from '../block.schema';
+
 import { PayloadType } from './button';
 
 export const payloadPatternSchema = z.object({
@@ -57,3 +59,19 @@ export const patternSchema = z.union([
 ]);
 
 export type Pattern = z.infer<typeof patternSchema>;
+
+export type NlpPatternMatchResult = {
+  block: BlockFull;
+  matchedPattern: NlpPattern[];
+};
+
+export function isNlpPattern(pattern: NlpPattern) {
+  return (
+    (typeof pattern === 'object' &&
+      pattern !== null &&
+      'entity' in pattern &&
+      'match' in pattern &&
+      pattern.match === 'entity') ||
+    pattern.match === 'value'
+  );
+}
@@ -31,6 +31,14 @@ import { LanguageRepository } from '@/i18n/repositories/language.repository';
 import { LanguageModel } from '@/i18n/schemas/language.schema';
 import { I18nService } from '@/i18n/services/i18n.service';
 import { LanguageService } from '@/i18n/services/language.service';
+import { NlpEntityRepository } from '@/nlp/repositories/nlp-entity.repository';
+import { NlpSampleEntityRepository } from '@/nlp/repositories/nlp-sample-entity.repository';
+import { NlpValueRepository } from '@/nlp/repositories/nlp-value.repository';
+import { NlpEntityModel } from '@/nlp/schemas/nlp-entity.schema';
+import { NlpSampleEntityModel } from '@/nlp/schemas/nlp-sample-entity.schema';
+import { NlpValueModel } from '@/nlp/schemas/nlp-value.schema';
+import { NlpEntityService } from '@/nlp/services/nlp-entity.service';
+import { NlpValueService } from '@/nlp/services/nlp-value.service';
 import { PluginService } from '@/plugins/plugins.service';
 import { SettingService } from '@/setting/services/setting.service';
 import {
@@ -43,12 +51,23 @@ import {
   blockGetStarted,
   blockProductListMock,
   blocks,
+  mockModifiedNlpBlock,
+  mockModifiedNlpBlockOne,
+  mockModifiedNlpBlockTwo,
+  mockNlpBlock,
+  mockNlpPatternsSetOne,
+  mockNlpPatternsSetThree,
+  mockNlpPatternsSetTwo,
 } from '@/utils/test/mocks/block';
 import {
   contextBlankInstance,
   subscriberContextBlankInstance,
 } from '@/utils/test/mocks/conversation';
-import { nlpEntitiesGreeting } from '@/utils/test/mocks/nlp';
+import {
+  mockNlpCacheMap,
+  mockNlpEntitiesSetOne,
+  nlpEntitiesGreeting,
+} from '@/utils/test/mocks/nlp';
 import {
   closeInMongodConnection,
   rootMongooseTestModule,
@@ -56,7 +75,7 @@ import {
 import { buildTestingMocks } from '@/utils/test/utils';
 
 import { BlockRepository } from '../repositories/block.repository';
-import { Block, BlockModel } from '../schemas/block.schema';
+import { Block, BlockFull, BlockModel } from '../schemas/block.schema';
 import { Category, CategoryModel } from '../schemas/category.schema';
 import { LabelModel } from '../schemas/label.schema';
 import { FileType } from '../schemas/types/attachment';
@@ -75,6 +94,7 @@ describe('BlockService', () => {
   let hasPreviousBlocks: Block;
   let contentService: ContentService;
   let contentTypeService: ContentTypeService;
+  let nlpEntityService: NlpEntityService;
 
   beforeAll(async () => {
     const { getMocks } = await buildTestingMocks({
@@ -91,6 +111,9 @@ describe('BlockService', () => {
         AttachmentModel,
         LabelModel,
         LanguageModel,
+        NlpEntityModel,
+        NlpSampleEntityModel,
+        NlpValueModel,
       ]),
     ],
     providers: [
@@ -106,6 +129,14 @@ describe('BlockService', () => {
       ContentService,
       AttachmentService,
       LanguageService,
+      NlpEntityRepository,
+      NlpValueRepository,
+      NlpSampleEntityRepository,
+      NlpEntityService,
+      {
+        provide: NlpValueService,
+        useValue: {},
+      },
       {
         provide: PluginService,
         useValue: {},
@@ -145,12 +176,14 @@ describe('BlockService', () => {
       contentTypeService,
       categoryRepository,
       blockRepository,
+      nlpEntityService,
     ] = await getMocks([
       BlockService,
       ContentService,
       ContentTypeService,
       CategoryRepository,
       BlockRepository,
+      NlpEntityService,
     ]);
     category = (await categoryRepository.findOne({ label: 'default' }))!;
     hasPreviousBlocks = (await blockRepository.findOne({
@@ -291,32 +324,163 @@ describe('BlockService', () => {
         blockGetStarted,
       );
       expect(result).toEqual([
-        {
-          entity: 'intent',
-          match: 'value',
-          value: 'greeting',
-        },
-        {
-          entity: 'firstname',
-          match: 'entity',
-        },
+        [
+          {
+            entity: 'intent',
+            match: 'value',
+            value: 'greeting',
+          },
+          {
+            entity: 'firstname',
+            match: 'entity',
+          },
+        ],
       ]);
     });
 
-    it('should return undefined when it does not match nlp patterns', () => {
+    it('should return empty array when it does not match nlp patterns', () => {
       const result = blockService.matchNLP(nlpEntitiesGreeting, {
         ...blockGetStarted,
         patterns: [[{ entity: 'lastname', match: 'value', value: 'Belakhel' }]],
       });
-      expect(result).toEqual(undefined);
+      expect(result).toEqual([]);
     });
 
-    it('should return undefined when unknown nlp patterns', () => {
+    it('should return empty array when unknown nlp patterns', () => {
       const result = blockService.matchNLP(nlpEntitiesGreeting, {
         ...blockGetStarted,
         patterns: [[{ entity: 'product', match: 'value', value: 'pizza' }]],
       });
-      expect(result).toEqual(undefined);
+      expect(result).toEqual([]);
     });
   });
 
+  describe('matchBestNLP', () => {
+    it('should return the block with the highest NLP score', async () => {
+      jest
+        .spyOn(nlpEntityService, 'getNlpMap')
+        .mockResolvedValue(mockNlpCacheMap);
+      const blocks = [mockNlpBlock, blockGetStarted]; // You can add more blocks with different patterns and scores
+      const nlp = mockNlpEntitiesSetOne;
+      // Spy on calculateBlockScore to check if it's called
+      const calculateBlockScoreSpy = jest.spyOn(
+        blockService,
+        'calculateBlockScore',
+      );
+      const bestBlock = await blockService.matchBestNLP(blocks, nlp);
+
+      // Ensure calculateBlockScore was called at least once for each block
+      expect(calculateBlockScoreSpy).toHaveBeenCalledTimes(2); // Called for each block
+
+      // Restore the spy after the test
+      calculateBlockScoreSpy.mockRestore();
+      // Assert that the block with the highest NLP score is selected
+      expect(bestBlock).toEqual(mockNlpBlock);
+    });
+
+    it('should return the block with the highest NLP score applying penalties', async () => {
+      jest
+        .spyOn(nlpEntityService, 'getNlpMap')
+        .mockResolvedValue(mockNlpCacheMap);
+      const blocks = [mockNlpBlock, mockModifiedNlpBlock]; // You can add more blocks with different patterns and scores
+      const nlp = mockNlpEntitiesSetOne;
+      // Spy on calculateBlockScore to check if it's called
+      const calculateBlockScoreSpy = jest.spyOn(
+        blockService,
+        'calculateBlockScore',
+      );
+      const bestBlock = await blockService.matchBestNLP(blocks, nlp);
+
+      // Ensure calculateBlockScore was called at least once for each block
+      expect(calculateBlockScoreSpy).toHaveBeenCalledTimes(2); // Called for each block
+
+      // Restore the spy after the test
+      calculateBlockScoreSpy.mockRestore();
+      // Assert that the block with the highest NLP score is selected
+      expect(bestBlock).toEqual(mockNlpBlock);
+    });
+
+    it('another case where it should return the block with the highest NLP score applying penalties', async () => {
+      jest
+        .spyOn(nlpEntityService, 'getNlpMap')
+        .mockResolvedValue(mockNlpCacheMap);
+      const blocks = [mockModifiedNlpBlockOne, mockModifiedNlpBlockTwo]; // You can add more blocks with different patterns and scores
+      const nlp = mockNlpEntitiesSetOne;
+      // Spy on calculateBlockScore to check if it's called
+      const calculateBlockScoreSpy = jest.spyOn(
+        blockService,
+        'calculateBlockScore',
+      );
+      const bestBlock = await blockService.matchBestNLP(blocks, nlp);
+
+      // Ensure calculateBlockScore was called at least once for each block
+      expect(calculateBlockScoreSpy).toHaveBeenCalledTimes(3); // Called for each block
+
+      // Restore the spy after the test
+      calculateBlockScoreSpy.mockRestore();
+      // Assert that the block with the highest NLP score is selected
+      expect(bestBlock).toEqual(mockModifiedNlpBlockTwo);
+    });
+
+    it('should return undefined if no blocks match or the list is empty', async () => {
+      jest
+        .spyOn(nlpEntityService, 'getNlpMap')
+        .mockResolvedValue(mockNlpCacheMap);
+      const blocks: BlockFull[] = []; // Empty block array
+      const nlp = mockNlpEntitiesSetOne;
+
+      const bestBlock = await blockService.matchBestNLP(blocks, nlp);
+
+      // Assert that undefined is returned when no blocks are available
+      expect(bestBlock).toBeUndefined();
+    });
+  });
+
+  describe('calculateBlockScore', () => {
+    it('should calculate the correct NLP score for a block', async () => {
+      jest
+        .spyOn(nlpEntityService, 'getNlpMap')
+        .mockResolvedValue(mockNlpCacheMap);
+      const score = await blockService.calculateBlockScore(
+        mockNlpPatternsSetOne,
+        mockNlpEntitiesSetOne,
+      );
+      const score2 = await blockService.calculateBlockScore(
+        mockNlpPatternsSetTwo,
+        mockNlpEntitiesSetOne,
+      );
+
+      expect(score).toBeGreaterThan(0);
+      expect(score2).toBe(0);
+      expect(score).toBeGreaterThan(score2);
+    });
+
+    it('should calculate the correct NLP score for a block and apply penalties', async () => {
+      jest
+        .spyOn(nlpEntityService, 'getNlpMap')
+        .mockResolvedValue(mockNlpCacheMap);
+      const score = await blockService.calculateBlockScore(
+        mockNlpPatternsSetOne,
+        mockNlpEntitiesSetOne,
+      );
+      const score2 = await blockService.calculateBlockScore(
+        mockNlpPatternsSetThree,
+        mockNlpEntitiesSetOne,
+      );
+
+      expect(score).toBeGreaterThan(0);
+      expect(score2).toBeGreaterThan(0);
+      expect(score).toBeGreaterThan(score2);
+    });
+
+    it('should return 0 if no matching entities are found', async () => {
+      jest.spyOn(nlpEntityService, 'getNlpMap').mockResolvedValue(new Map());
+      const score = await blockService.calculateBlockScore(
+        mockNlpPatternsSetTwo,
+        mockNlpEntitiesSetOne,
+      );
+
+      expect(score).toBe(0); // No matching entity, so score should be 0
+    });
+  });
 });
@@ -16,6 +16,8 @@ import { CONSOLE_CHANNEL_NAME } from '@/extensions/channels/console/settings';
 import { NLU } from '@/helper/types';
 import { I18nService } from '@/i18n/services/i18n.service';
 import { LanguageService } from '@/i18n/services/language.service';
+import { NlpCacheMapValues } from '@/nlp/schemas/types';
+import { NlpEntityService } from '@/nlp/services/nlp-entity.service';
 import { PluginService } from '@/plugins/plugins.service';
 import { PluginType } from '@/plugins/types';
 import { SettingService } from '@/setting/services/setting.service';
@@ -53,6 +55,7 @@ export class BlockService extends BaseService<
     private readonly pluginService: PluginService,
     protected readonly i18n: I18nService,
     protected readonly languageService: LanguageService,
+    protected readonly entityService: NlpEntityService,
   ) {
     super(repository);
   }
@@ -181,20 +184,21 @@ export class BlockService extends BaseService<
         .shift();
 
       // Perform an NLP Match
+
       if (!block && nlp) {
-        // Find block pattern having the best match of nlp entities
-        let nlpBest = 0;
-        filteredBlocks.forEach((b, index, self) => {
-          const nlpPattern = this.matchNLP(nlp, b);
-          if (nlpPattern && nlpPattern.length > nlpBest) {
-            nlpBest = nlpPattern.length;
-            block = self[index];
-          }
-        });
+        // Use the `reduce` function to iterate over `filteredBlocks` and accumulate a new array `matchesWithPatterns`.
+        // This approach combines the matching of NLP patterns and filtering of blocks with empty or invalid matches
+        // into a single operation. This avoids the need for a separate mapping and filtering step, improving performance.
+        // For each block in `filteredBlocks`, we call `matchNLP` to find patterns that match the NLP data.
+        // If `matchNLP` returns a non-empty list of matched patterns, the block and its matched patterns are added
+        // to the accumulator array `acc`, which is returned as the final result.
+        // This ensures that only blocks with valid matches are kept, and blocks with no matches are excluded,
+        // all while iterating through the list only once.
+
+        block = await this.matchBestNLP(filteredBlocks, nlp);
       }
     }
-    // Uknown event type => return false;
-    // this.logger.error('Unable to recognize event type while matching', event);
+
     return block;
   }
|
|||||||
matchNLP(
|
matchNLP(
|
||||||
nlp: NLU.ParseEntities,
|
nlp: NLU.ParseEntities,
|
||||||
block: Block | BlockFull,
|
block: Block | BlockFull,
|
||||||
): NlpPattern[] | undefined {
|
): NlpPattern[][] | undefined {
|
||||||
// No nlp entities to check against
|
// No nlp entities to check against
|
||||||
if (nlp.entities.length === 0) {
|
if (nlp.entities.length === 0) {
|
||||||
return undefined;
|
return undefined;
|
||||||
@ -313,14 +317,13 @@ export class BlockService extends BaseService<
|
|||||||
const nlpPatterns = block.patterns?.filter((p) => {
|
const nlpPatterns = block.patterns?.filter((p) => {
|
||||||
return Array.isArray(p);
|
return Array.isArray(p);
|
||||||
}) as NlpPattern[][];
|
}) as NlpPattern[][];
|
||||||
|
|
||||||
// No nlp patterns found
|
// No nlp patterns found
|
||||||
if (nlpPatterns.length === 0) {
|
if (nlpPatterns.length === 0) {
|
||||||
return undefined;
|
return undefined;
|
||||||
}
|
}
|
||||||
|
|
||||||
// Find NLP pattern match based on best guessed entities
|
// Find NLP pattern match based on best guessed entities
|
||||||
return nlpPatterns.find((entities: NlpPattern[]) => {
|
return nlpPatterns.filter((entities: NlpPattern[]) => {
|
||||||
return entities.every((ev: NlpPattern) => {
|
return entities.every((ev: NlpPattern) => {
|
||||||
if (ev.match === 'value') {
|
if (ev.match === 'value') {
|
||||||
return nlp.entities.find((e) => {
|
return nlp.entities.find((e) => {
|
||||||
@ -338,6 +341,139 @@ export class BlockService extends BaseService<
|
|||||||
});
|
});
|
||||||
}
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Matches the provided NLU parsed entities with patterns in a set of blocks and returns
|
||||||
|
* the block with the highest matching score.
|
||||||
|
*
|
||||||
|
* For each block, it checks the patterns against the NLU parsed entities, calculates
|
||||||
|
* a score for each match, and selects the block with the highest score.
|
||||||
|
*
|
||||||
|
* @param {BlockFull[]} blocks - An array of BlockFull objects representing potential matches.
|
||||||
|
* @param {NLU.ParseEntities} nlp - The NLU parsed entities used for pattern matching.
|
||||||
|
*
|
||||||
|
* @returns {Promise<BlockFull | undefined>} - A promise that resolves to the BlockFull
|
||||||
|
* with the highest match score, or undefined if no matches are found.
|
||||||
|
*/
|
||||||
|
async matchBestNLP(
|
||||||
|
blocks: BlockFull[],
|
||||||
|
nlp: NLU.ParseEntities,
|
||||||
|
): Promise<BlockFull | undefined> {
|
||||||
|
const scoredBlocks = await Promise.all(
|
||||||
|
blocks.map(async (block) => {
|
||||||
|
const matchedPatterns = this.matchNLP(nlp, block) || [];
|
||||||
|
|
||||||
|
const scores = await Promise.all(
|
||||||
|
matchedPatterns.map((pattern) =>
|
||||||
|
this.calculateBlockScore(pattern, nlp),
|
||||||
|
),
|
||||||
|
);
|
||||||
|
|
||||||
|
const maxScore = scores.length > 0 ? Math.max(...scores) : 0;
|
||||||
|
|
||||||
|
return { block, score: maxScore };
|
||||||
|
}),
|
||||||
|
);
|
||||||
|
|
||||||
|
const best = scoredBlocks.reduce(
|
||||||
|
(acc, curr) => (curr.score > acc.score ? curr : acc),
|
||||||
|
{ block: undefined, score: 0 },
|
||||||
|
);
|
||||||
|
|
||||||
|
return best.block;
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Computes the NLP score for a given block using its matched NLP patterns and parsed NLP entities.
|
||||||
|
*
|
||||||
|
* Each pattern is evaluated against the parsed NLP entities to determine matches based on entity name,
|
||||||
|
* value, and confidence. A score is computed using the entity's weight and the confidence level of the match.
|
||||||
|
* A penalty factor is optionally applied for entity-level matches to adjust the scoring.
|
||||||
|
*
|
||||||
|
* The function uses a cache (`nlpCacheMap`) to avoid redundant database lookups for entity metadata.
|
||||||
|
*
|
||||||
|
* @param patterns - The NLP patterns associated with the block.
|
||||||
|
* @param nlp - The parsed NLP entities from the user input.
|
||||||
|
* @param nlpCacheMap - A cache to reuse fetched entity metadata (e.g., weights and valid values).
|
||||||
|
* @param nlpPenaltyFactor - A multiplier applied to scores when the pattern match type is 'entity'.
|
||||||
|
* @returns A numeric score representing how well the block matches the given NLP context.
|
||||||
|
*/
|
||||||
|
async calculateBlockScore(
|
||||||
|
patterns: NlpPattern[],
|
||||||
|
nlp: NLU.ParseEntities,
|
||||||
|
): Promise<number> {
|
||||||
|
if (!patterns.length) return 0;
|
||||||
|
|
||||||
|
const nlpCacheMap = await this.entityService.getNlpMap();
|
||||||
|
// @TODO Make nluPenaltyFactor configurable in UI settings
|
||||||
|
const nluPenaltyFactor = 0.95;
|
||||||
|
// Compute individual pattern scores using the cache
|
||||||
|
const patternScores: number[] = patterns.map((pattern) => {
|
||||||
|
const entityData = nlpCacheMap.get(pattern.entity);
|
||||||
|
if (!entityData) return 0;
|
||||||
|
|
||||||
|
const matchedEntity: NLU.ParseEntity | undefined = nlp.entities.find(
|
||||||
|
(e) => this.matchesEntityData(e, pattern, entityData),
|
||||||
|
);
|
||||||
|
|
||||||
|
return this.computePatternScore(
|
||||||
|
matchedEntity,
|
||||||
|
pattern,
|
||||||
|
entityData,
|
||||||
|
nluPenaltyFactor,
|
||||||
|
);
|
||||||
|
});
|
||||||
|
|
||||||
|
// Sum the scores
|
||||||
|
return patternScores.reduce((sum, score) => sum + score, 0);
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Checks if a given `ParseEntity` from the NLP model matches the specified pattern
|
||||||
|
* and if its value exists within the values provided in the cache for the specified entity.
|
||||||
|
*
|
||||||
|
* @param e - The `ParseEntity` object from the NLP model, containing information about the entity and its value.
|
||||||
|
* @param pattern - The `NlpPattern` object representing the entity and value pattern to be matched.
|
||||||
|
* @param entityData - The `NlpCacheMapValues` object containing cached data, including entity values and weight, for the entity being matched.
|
||||||
|
*
|
||||||
|
* @returns A boolean indicating whether the `ParseEntity` matches the pattern and entity data from the cache.
|
||||||
|
*
|
||||||
|
* - The function compares the entity type between the `ParseEntity` and the `NlpPattern`.
|
||||||
|
* - If the pattern's match type is not `'value'`, it checks if the entity's value is present in the cache's `values` array.
|
||||||
|
* - If the pattern's match type is `'value'`, it further ensures that the entity's value matches the specified value in the pattern.
|
||||||
|
* - Returns `true` if all conditions are met, otherwise `false`.
|
||||||
|
*/
|
||||||
|
private matchesEntityData(
|
||||||
|
e: NLU.ParseEntity,
|
||||||
|
pattern: NlpPattern,
|
||||||
|
entityData: NlpCacheMapValues,
|
||||||
|
): boolean {
|
||||||
|
return (
|
||||||
|
e.entity === pattern.entity &&
|
||||||
|
entityData?.values.some((v) => v === e.value) &&
|
||||||
|
(pattern.match !== 'value' || e.value === pattern.value)
|
||||||
|
);
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Computes the score for a given entity based on its confidence, weight, and penalty factor.
|
||||||
|
*
|
||||||
|
* @param entity - The `ParseEntity` to check, which may be `undefined` if no match is found.
|
||||||
|
* @param pattern - The `NlpPattern` object that specifies how to match the entity and its value.
|
||||||
|
* @param entityData - The cached data for the given entity, including `weight` and `values`.
|
||||||
|
* @param nlpPenaltyFactor - The penalty factor applied when the pattern's match type is 'entity'.
|
||||||
|
* @returns The computed score based on the entity's confidence, the cached weight, and the penalty factor.
|
||||||
|
*/
|
||||||
|
private computePatternScore(
|
||||||
|
entity: NLU.ParseEntity | undefined,
|
||||||
|
pattern: NlpPattern,
|
||||||
|
entityData: NlpCacheMapValues,
|
||||||
|
nlpPenaltyFactor: number,
|
||||||
|
): number {
|
||||||
|
if (!entity || !entity.confidence) return 0;
|
||||||
|
const penalty = pattern.match === 'entity' ? nlpPenaltyFactor : 1;
|
||||||
|
return entity.confidence * entityData.weight * penalty;
|
||||||
|
}
|
||||||
|
|
||||||
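The scoring arithmetic can be sketched standalone: each matched pattern contributes `confidence × weight × penalty`, where the penalty only applies to entity-level (wildcard) matches. The types and weights below are simplified, hypothetical stand-ins for the real Hexabot types, not the actual implementation:

```typescript
// Hypothetical, simplified stand-ins for the real Hexabot types.
type Pattern = { entity: string; match: 'entity' | 'value'; value?: string };
type Parsed = { entity: string; value: string; confidence: number };
type EntityMeta = { weight: number; values: string[] };

// Cached entity metadata (illustrative weights).
const cache = new Map<string, EntityMeta>([
  ['intent', { weight: 1, values: ['buy', 'sell'] }],
  ['subject', { weight: 5, values: ['product'] }],
]);

const penaltyFactor = 0.95; // applied to entity-level (wildcard) matches

function scorePatterns(patterns: Pattern[], parsed: Parsed[]): number {
  return patterns.reduce((sum, p) => {
    const meta = cache.get(p.entity);
    if (!meta) return sum; // unknown entity contributes nothing
    const hit = parsed.find(
      (e) =>
        e.entity === p.entity &&
        meta.values.includes(e.value) &&
        (p.match !== 'value' || e.value === p.value),
    );
    if (!hit) return sum;
    const penalty = p.match === 'entity' ? penaltyFactor : 1;
    return sum + hit.confidence * meta.weight * penalty;
  }, 0);
}

const score = scorePatterns(
  [{ entity: 'intent', match: 'value', value: 'buy' }],
  [{ entity: 'intent', value: 'buy', confidence: 0.9 }],
);
```

With these sample weights, a value-level `intent=buy` match at confidence 0.9 contributes 0.9 × 1, while an entity-level `subject` match at confidence 0.8 would contribute 0.8 × 5 × 0.95.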
/**
 * Matches an outcome-based block from a list of available blocks
 * based on the outcome of a system message.
@@ -33,6 +33,14 @@ import { LanguageRepository } from '@/i18n/repositories/language.repository';
import { LanguageModel } from '@/i18n/schemas/language.schema';
import { I18nService } from '@/i18n/services/i18n.service';
import { LanguageService } from '@/i18n/services/language.service';
import { NlpEntityRepository } from '@/nlp/repositories/nlp-entity.repository';
import { NlpSampleEntityRepository } from '@/nlp/repositories/nlp-sample-entity.repository';
import { NlpValueRepository } from '@/nlp/repositories/nlp-value.repository';
import { NlpEntityModel } from '@/nlp/schemas/nlp-entity.schema';
import { NlpSampleEntityModel } from '@/nlp/schemas/nlp-sample-entity.schema';
import { NlpValueModel } from '@/nlp/schemas/nlp-value.schema';
import { NlpEntityService } from '@/nlp/services/nlp-entity.service';
import { NlpValueService } from '@/nlp/services/nlp-value.service';
import { PluginService } from '@/plugins/plugins.service';
import { SettingService } from '@/setting/services/setting.service';
import { installBlockFixtures } from '@/utils/test/fixtures/block';
@@ -100,6 +108,9 @@ describe('BlockService', () => {
          MenuModel,
          ContextVarModel,
          LanguageModel,
          NlpEntityModel,
          NlpSampleEntityModel,
          NlpValueModel,
        ]),
        JwtModule,
      ],
@@ -131,6 +142,11 @@ describe('BlockService', () => {
        ContextVarService,
        ContextVarRepository,
        LanguageService,
        NlpEntityService,
        NlpEntityRepository,
        NlpSampleEntityRepository,
        NlpValueRepository,
        NlpValueService,
        {
          provide: HelperService,
          useValue: {},
@@ -139,6 +139,7 @@ describe('BaseNlpHelper', () => {
        updatedAt: new Date(),
        builtin: false,
        lookups: [],
        weight: 1,
      },
      entity2: {
        id: new ObjectId().toString(),
@@ -147,6 +148,7 @@ describe('BaseNlpHelper', () => {
        updatedAt: new Date(),
        builtin: false,
        lookups: [],
        weight: 1,
      },
    });
    jest.spyOn(NlpValue, 'getValueMap').mockReturnValue({
@@ -207,6 +209,7 @@ describe('BaseNlpHelper', () => {
        updatedAt: new Date(),
        builtin: false,
        lookups: [],
        weight: 1,
      },
    });

@@ -30,6 +30,14 @@ import { MenuModel } from '@/cms/schemas/menu.schema';
import { ContentService } from '@/cms/services/content.service';
import { MenuService } from '@/cms/services/menu.service';
import { I18nService } from '@/i18n/services/i18n.service';
import { NlpEntityRepository } from '@/nlp/repositories/nlp-entity.repository';
import { NlpSampleEntityRepository } from '@/nlp/repositories/nlp-sample-entity.repository';
import { NlpValueRepository } from '@/nlp/repositories/nlp-value.repository';
import { NlpEntityModel } from '@/nlp/schemas/nlp-entity.schema';
import { NlpSampleEntityModel } from '@/nlp/schemas/nlp-sample-entity.schema';
import { NlpValueModel } from '@/nlp/schemas/nlp-value.schema';
import { NlpEntityService } from '@/nlp/services/nlp-entity.service';
import { NlpValueService } from '@/nlp/services/nlp-value.service';
import { NlpService } from '@/nlp/services/nlp.service';
import { PluginService } from '@/plugins/plugins.service';
import { SettingService } from '@/setting/services/setting.service';
@@ -75,6 +83,9 @@ describe('TranslationController', () => {
          BlockModel,
          ContentModel,
          LanguageModel,
          NlpEntityModel,
          NlpSampleEntityModel,
          NlpValueModel,
        ]),
      ],
      providers: [
@@ -130,6 +141,11 @@ describe('TranslationController', () => {
        },
        LanguageService,
        LanguageRepository,
        NlpEntityRepository,
        NlpEntityService,
        NlpValueRepository,
        NlpValueService,
        NlpSampleEntityRepository,
      ],
    });
    [translationService, translationController] = await getMocks([
@@ -6,6 +6,7 @@
 * 2. All derivative works must include clear attribution to the original creator and software, Hexastack and Hexabot, in a prominent location (e.g., in the software's "About" section, documentation, and README file).
 */

import { CACHE_MANAGER } from '@nestjs/cache-manager';
import {
  BadRequestException,
  MethodNotAllowedException,
@@ -67,6 +68,12 @@ describe('NlpEntityController', () => {
        NlpValueService,
        NlpSampleEntityRepository,
        NlpValueRepository,
        {
          provide: CACHE_MANAGER,
          useValue: {
            del: jest.fn(),
          },
        },
      ],
    });
    [nlpEntityController, nlpValueService, nlpEntityService] = await getMocks([
@@ -109,6 +116,7 @@ describe('NlpEntityController', () => {
          ) as NlpEntityFull['values'],
          lookups: curr.lookups!,
          builtin: curr.builtin!,
          weight: curr.weight!,
        });
        return acc;
      },
@@ -163,6 +171,7 @@ describe('NlpEntityController', () => {
        name: 'sentiment',
        lookups: ['trait'],
        builtin: false,
        weight: 1,
      };
      const result = await nlpEntityController.create(sentimentEntity);
      expect(result).toEqualPayload(sentimentEntity);
@@ -214,6 +223,7 @@ describe('NlpEntityController', () => {
        updatedAt: firstNameEntity!.updatedAt,
        lookups: firstNameEntity!.lookups,
        builtin: firstNameEntity!.builtin,
        weight: firstNameEntity!.weight,
      };
      const result = await nlpEntityController.findOne(firstNameEntity!.id, [
        'values',
@@ -238,6 +248,7 @@ describe('NlpEntityController', () => {
        doc: '',
        lookups: ['trait'],
        builtin: false,
        weight: 1,
      };
      const result = await nlpEntityController.updateOne(
        firstNameEntity!.id,
@@ -258,7 +269,7 @@ describe('NlpEntityController', () => {
      ).rejects.toThrow(NotFoundException);
    });

    it('should throw an exception if entity is builtin but weight not provided', async () => {
      const updateNlpEntity: NlpEntityCreateDto = {
        name: 'updated',
        doc: '',
@@ -269,6 +280,57 @@ describe('NlpEntityController', () => {
        nlpEntityController.updateOne(buitInEntityId!, updateNlpEntity),
      ).rejects.toThrow(MethodNotAllowedException);
    });

    it('should update weight if entity is builtin and weight is provided', async () => {
      const updatedNlpEntity: NlpEntityCreateDto = {
        name: 'updated',
        doc: '',
        lookups: ['trait'],
        builtin: false,
        weight: 4,
      };
      const findOneSpy = jest.spyOn(nlpEntityService, 'findOne');
      const updateWeightSpy = jest.spyOn(nlpEntityService, 'updateWeight');

      const result = await nlpEntityController.updateOne(
        buitInEntityId!,
        updatedNlpEntity,
      );

      expect(findOneSpy).toHaveBeenCalledWith(buitInEntityId!);
      expect(updateWeightSpy).toHaveBeenCalledWith(
        buitInEntityId!,
        updatedNlpEntity.weight,
      );
      expect(result.weight).toBe(updatedNlpEntity.weight);
    });

    it('should update only the weight of the builtin entity', async () => {
      const updatedNlpEntity: NlpEntityCreateDto = {
        name: 'updated',
        doc: '',
        lookups: ['trait'],
        builtin: false,
        weight: 4,
      };
      const originalEntity: NlpEntity | null = await nlpEntityService.findOne(
        buitInEntityId!,
      );

      const result: NlpEntity = await nlpEntityController.updateOne(
        buitInEntityId!,
        updatedNlpEntity,
      );

      // Check weight is updated
      expect(result.weight).toBe(updatedNlpEntity.weight);

      Object.entries(originalEntity!).forEach(([key, value]) => {
        if (key !== 'weight' && key !== 'updatedAt') {
          expect(result[key as keyof typeof result]).toEqual(value);
        }
      });
    });
  });
  describe('deleteMany', () => {
    it('should delete multiple nlp entities', async () => {
@@ -157,10 +157,19 @@ export class NlpEntityController extends BaseController<
      this.logger.warn(`Unable to update NLP Entity by id ${id}`);
      throw new NotFoundException(`NLP Entity with ID ${id} not found`);
    }

    if (nlpEntity.builtin) {
      // Only allow weight update for builtin entities
      if (updateNlpEntityDto.weight) {
        return await this.nlpEntityService.updateWeight(
          id,
          updateNlpEntityDto.weight,
        );
      } else {
        throw new MethodNotAllowedException(
          `Cannot update builtin NLP Entity ${nlpEntity.name} except for weight`,
        );
      }
    }

    return await this.nlpEntityService.updateOne(id, updateNlpEntityDto);
@@ -372,6 +372,7 @@ describe('NlpSampleController', () => {
        lookups: ['trait'],
        doc: '',
        builtin: false,
        weight: 1,
      };
      const priceValueEntity = await nlpEntityService.findOne({
        name: 'intent',
@@ -6,6 +6,7 @@
 * 2. All derivative works must include clear attribution to the original creator and software, Hexastack and Hexabot, in a prominent location (e.g., in the software's "About" section, documentation, and README file).
 */

import { CACHE_MANAGER } from '@nestjs/cache-manager';
import { BadRequestException, NotFoundException } from '@nestjs/common';
import { MongooseModule } from '@nestjs/mongoose';

@@ -57,6 +58,12 @@ describe('NlpValueController', () => {
        NlpSampleEntityRepository,
        NlpEntityService,
        NlpEntityRepository,
        {
          provide: CACHE_MANAGER,
          useValue: {
            del: jest.fn(),
          },
        },
      ],
    });
    [nlpValueController, nlpValueService, nlpEntityService] = await getMocks([
@@ -1,5 +1,5 @@
/*
 * Copyright © 2025 Hexastack. All rights reserved.
 *
 * Licensed under the GNU Affero General Public License v3.0 (AGPLv3) with the following additional terms:
 * 1. The name "Hexabot" is a trademark of Hexastack. You may not use this name in derivative works without express written permission.
@@ -11,10 +11,13 @@ import {
  IsArray,
  IsBoolean,
  IsIn,
  IsInt,
  IsNotEmpty,
  IsNumber,
  IsOptional,
  IsString,
  Matches,
  Min,
} from 'class-validator';

import { DtoConfig } from '@/utils/types/dto.types';
@@ -47,6 +50,17 @@ export class NlpEntityCreateDto {
  @IsBoolean()
  @IsOptional()
  builtin?: boolean;

  @ApiPropertyOptional({
    description: 'Nlp entity associated weight for next block triggering',
    type: Number,
    minimum: 1,
  })
  @IsNumber()
  @IsOptional()
  @Min(1, { message: 'Weight must be a positive integer' })
  @IsInt({ message: 'Weight must be an integer' })
  weight?: number;
}

export type NlpEntityDto = DtoConfig<{
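The DTO's decorators enforce that a weight, when provided, is an integer no smaller than 1. A plain-TypeScript sketch of that rule (the real check is performed by class-validator, not by this hypothetical helper):

```typescript
// Sketch of the weight rule the `@Min(1)` / `@IsInt()` decorators enforce:
// the weight must be an integer and at least 1.
function isValidWeight(weight: number): boolean {
  return Number.isInteger(weight) && weight >= 1;
}
```

This mirrors the unit tests below: `4` is accepted, while `0`, `2.5`, and `-3` are rejected with "Weight must be a positive integer".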
@@ -1,5 +1,5 @@
/*
 * Copyright © 2025 Hexastack. All rights reserved.
 *
 * Licensed under the GNU Affero General Public License v3.0 (AGPLv3) with the following additional terms:
 * 1. The name "Hexabot" is a trademark of Hexastack. You may not use this name in derivative works without express written permission.
@@ -1,5 +1,5 @@
/*
 * Copyright © 2025 Hexastack. All rights reserved.
 *
 * Licensed under the GNU Affero General Public License v3.0 (AGPLv3) with the following additional terms:
 * 1. The name "Hexabot" is a trademark of Hexastack. You may not use this name in derivative works without express written permission.
@@ -58,6 +58,12 @@ export class NlpEntityStub extends BaseSchema {
  @Prop({ type: Boolean, default: false })
  builtin: boolean;

  /**
   * Entity's weight used to determine the next block to trigger in the conversational flow.
   */
  @Prop({ type: Number, default: 1, min: 0 })
  weight: number;

  /**
   * Returns a map object for entities
   * @param entities - Array of entities
@@ -1,5 +1,5 @@
/*
 * Copyright © 2025 Hexastack. All rights reserved.
 *
 * Licensed under the GNU Affero General Public License v3.0 (AGPLv3) with the following additional terms:
 * 1. The name "Hexabot" is a trademark of Hexastack. You may not use this name in derivative works without express written permission.
@@ -25,3 +25,11 @@ export enum NlpSampleState {
  test = 'test',
  inbox = 'inbox',
}

export type NlpCacheMap = Map<string, NlpCacheMapValues>;

export type NlpCacheMapValues = {
  id: string;
  weight: number;
  values: string[];
};
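The `NlpCacheMap` type keys entity metadata by entity name, so the scorer can resolve weights and known values without a database round trip. A hypothetical illustration of the shape (entity names and weights are made up):

```typescript
// Local copies of the type aliases defined above, for a self-contained sketch.
type NlpCacheMapValues = { id: string; weight: number; values: string[] };
type NlpCacheMap = Map<string, NlpCacheMapValues>;

// Illustrative cache contents.
const nlpCacheMap: NlpCacheMap = new Map([
  ['intent', { id: '1', weight: 1, values: ['buy', 'sell'] }],
  ['subject', { id: '2', weight: 5, values: ['product'] }],
]);

// A lookup miss yields undefined; the scorer treats that as a score of 0.
const weightOf = (entity: string): number =>
  nlpCacheMap.get(entity)?.weight ?? 0;
```

Keeping weights and values together in one map entry is what lets `calculateBlockScore` stay synchronous once the map has been fetched.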
@@ -6,6 +6,7 @@
 * 2. All derivative works must include clear attribution to the original creator and software, Hexastack and Hexabot, in a prominent location (e.g., in the software's "About" section, documentation, and README file).
 */

import { CACHE_MANAGER } from '@nestjs/cache-manager';
import { MongooseModule } from '@nestjs/mongoose';

import { nlpEntityFixtures } from '@/utils/test/fixtures/nlpentity';
@@ -20,7 +21,11 @@ import { buildTestingMocks } from '@/utils/test/utils';
import { NlpEntityRepository } from '../repositories/nlp-entity.repository';
import { NlpSampleEntityRepository } from '../repositories/nlp-sample-entity.repository';
import { NlpValueRepository } from '../repositories/nlp-value.repository';
import {
  NlpEntity,
  NlpEntityFull,
  NlpEntityModel,
} from '../schemas/nlp-entity.schema';
import { NlpSampleEntityModel } from '../schemas/nlp-sample-entity.schema';
import { NlpValueModel } from '../schemas/nlp-value.schema';

@@ -48,6 +53,12 @@ describe('nlpEntityService', () => {
        NlpValueService,
        NlpValueRepository,
        NlpSampleEntityRepository,
        {
          provide: CACHE_MANAGER,
          useValue: {
            del: jest.fn(),
          },
        },
      ],
    });
    [nlpEntityService, nlpEntityRepository, nlpValueRepository] =
@@ -117,6 +128,77 @@ describe('nlpEntityService', () => {
      expect(result).toEqualPayload(entitiesWithValues);
    });
  });
  describe('NlpEntityService - updateWeight', () => {
    let createdEntity: NlpEntity;
    beforeEach(async () => {
      createdEntity = await nlpEntityRepository.create({
        name: 'testentity',
        builtin: false,
        weight: 3,
      });
    });

    it('should update the weight of an NLP entity', async () => {
      const newWeight = 8;

      const updatedEntity = await nlpEntityService.updateWeight(
        createdEntity.id,
        newWeight,
      );

      expect(updatedEntity.weight).toBe(newWeight);
    });

    it('should handle updating weight of non-existent entity', async () => {
      const nonExistentId = '507f1f77bcf86cd799439011'; // Example MongoDB ObjectId

      try {
        await nlpEntityService.updateWeight(nonExistentId, 5);
        fail('Expected error was not thrown');
      } catch (error) {
        expect(error).toBeDefined();
      }
    });

    it('should use default weight of 1 when creating entity without weight', async () => {
      const createdEntity = await nlpEntityRepository.create({
        name: 'entityWithoutWeight',
        builtin: true,
        // weight not specified
      });

      expect(createdEntity.weight).toBe(1);
    });

    it('should throw an error if weight is less than 1', async () => {
      const invalidWeight = 0;

      await expect(
        nlpEntityService.updateWeight(createdEntity.id, invalidWeight),
      ).rejects.toThrow('Weight must be a positive integer');
    });

    it('should throw an error if weight is a decimal', async () => {
      const invalidWeight = 2.5;

      await expect(
        nlpEntityService.updateWeight(createdEntity.id, invalidWeight),
      ).rejects.toThrow('Weight must be a positive integer');
    });

    it('should throw an error if weight is negative', async () => {
      const invalidWeight = -3;

      await expect(
        nlpEntityService.updateWeight(createdEntity.id, invalidWeight),
      ).rejects.toThrow('Weight must be a positive integer');
    });

    afterEach(async () => {
      // Clean the collection after each test
      await nlpEntityRepository.deleteOne(createdEntity.id);
    });
  });

  describe('storeNewEntities', () => {
    it('should store new entities', async () => {
@@ -150,4 +232,47 @@ describe('nlpEntityService', () => {
      expect(result).toEqualPayload(storedEntites);
    });
  });
  describe('getNlpMap', () => {
    it('should return a NlpCacheMap with the correct structure', async () => {
      // Arrange
      const firstMockValues = {
        id: '1',
        weight: 1,
      };
      const firstMockEntity = {
        name: 'intent',
        ...firstMockValues,
        values: [{ value: 'buy' }, { value: 'sell' }],
      } as unknown as Partial<NlpEntityFull>;
      const secondMockValues = {
        id: '2',
        weight: 5,
      };
      const secondMockEntity = {
        name: 'subject',
        ...secondMockValues,
        values: [{ value: 'product' }],
      } as unknown as Partial<NlpEntityFull>;
      const mockEntities = [firstMockEntity, secondMockEntity];

      // Mock findAllAndPopulate
      jest
        .spyOn(nlpEntityService, 'findAllAndPopulate')
        .mockResolvedValue(mockEntities as unknown as NlpEntityFull[]);

      // Act
      const result = await nlpEntityService.getNlpMap();

      expect(result).toBeInstanceOf(Map);
      expect(result.size).toBe(2);
      expect(result.get('intent')).toEqual({
        name: 'intent',
        ...firstMockEntity,
      });
      expect(result.get('subject')).toEqual({
        name: 'subject',
        ...secondMockEntity,
      });
    });
  });
});
@@ -1,13 +1,18 @@
 /*
- * Copyright © 2024 Hexastack. All rights reserved.
+ * Copyright © 2025 Hexastack. All rights reserved.
  *
  * Licensed under the GNU Affero General Public License v3.0 (AGPLv3) with the following additional terms:
  * 1. The name "Hexabot" is a trademark of Hexastack. You may not use this name in derivative works without express written permission.
  * 2. All derivative works must include clear attribution to the original creator and software, Hexastack and Hexabot, in a prominent location (e.g., in the software's "About" section, documentation, and README file).
  */

-import { Injectable } from '@nestjs/common';
+import { CACHE_MANAGER } from '@nestjs/cache-manager';
+import { Inject, Injectable } from '@nestjs/common';
+import { OnEvent } from '@nestjs/event-emitter';
+import { Cache } from 'cache-manager';
+
+import { NLP_MAP_CACHE_KEY } from '@/utils/constants/cache';
+import { Cacheable } from '@/utils/decorators/cacheable.decorator';
 import { BaseService } from '@/utils/generics/base-service';

 import { Lookup, NlpEntityDto } from '../dto/nlp-entity.dto';
@@ -17,7 +22,7 @@ import {
   NlpEntityFull,
   NlpEntityPopulate,
 } from '../schemas/nlp-entity.schema';
-import { NlpSampleEntityValue } from '../schemas/types';
+import { NlpCacheMap, NlpSampleEntityValue } from '../schemas/types';

 import { NlpValueService } from './nlp-value.service';

@@ -30,6 +35,7 @@ export class NlpEntityService extends BaseService<
 > {
   constructor(
     readonly repository: NlpEntityRepository,
+    @Inject(CACHE_MANAGER) private readonly cacheManager: Cache,
     private readonly nlpValueService: NlpValueService,
   ) {
     super(repository);
@@ -46,6 +52,28 @@ export class NlpEntityService extends BaseService<
     return await this.repository.deleteOne(id);
   }

+  /**
+   * Updates the `weight` field of a specific NLP entity by its ID.
+   *
+   * This method is part of the NLP-based blocks prioritization strategy.
+   * The weight influences the scoring of blocks when multiple blocks match a user's input.
+   *
+   * @param id - The unique identifier of the entity to update.
+   * @param updatedWeight - The new weight to assign. Must be a positive integer.
+   * @throws Error if the weight is not a positive integer.
+   * @returns A promise that resolves to the updated entity.
+   */
+  async updateWeight(id: string, updatedWeight: number): Promise<NlpEntity> {
+    if (!Number.isInteger(updatedWeight) || updatedWeight < 1) {
+      throw new Error('Weight must be a positive integer');
+    }
+
+    return await this.repository.updateOne(
+      id,
+      { weight: updatedWeight },
+      { new: true },
+    );
+  }
+
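The guard inside `updateWeight` is self-contained enough to sketch in isolation. The helper names below are illustrative and not part of the commit; they only mirror the validation the service performs:

```typescript
// Mirrors the validation in NlpEntityService.updateWeight:
// a weight is accepted only if it is an integer >= 1.
function isValidWeight(weight: number): boolean {
  return Number.isInteger(weight) && weight >= 1;
}

function assertValidWeight(weight: number): void {
  if (!isValidWeight(weight)) {
    throw new Error('Weight must be a positive integer');
  }
}
```

Rejecting `0`, negatives, and non-integers up front keeps invalid weights out of the scoring path entirely.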
   /**
    * Stores new entities based on the sample text and sample entities.
    * Deletes all values relative to this entity before deleting the entity itself.
@@ -97,4 +125,49 @@ export class NlpEntityService extends BaseService<
     );
     return Promise.all(findOrCreate);
   }
+
+  /**
+   * Clears the NLP map cache.
+   */
+  async clearCache() {
+    await this.cacheManager.del(NLP_MAP_CACHE_KEY);
+  }
+
+  /**
+   * Event handler for NLP entity updates. Listens to 'hook:nlpEntity:*' events
+   * and invalidates the cache for NLP entities when triggered.
+   */
+  @OnEvent('hook:nlpEntity:*')
+  async handleNlpEntityUpdateEvent() {
+    this.clearCache();
+  }
+
+  /**
+   * Event handler for NLP value updates. Listens to 'hook:nlpValue:*' events
+   * and invalidates the cache for NLP values when triggered.
+   */
+  @OnEvent('hook:nlpValue:*')
+  async handleNlpValueUpdateEvent() {
+    this.clearCache();
+  }
+
+  /**
+   * Retrieves all NLP entities (with their populated values) and builds a
+   * lookup map where each key is the entity name and each value holds the
+   * entity's metadata (id, weight, and list of values). The result is cached
+   * under the configured cache key.
+   *
+   * @returns A Promise that resolves to a map of entity name to its corresponding entity.
+   */
+  @Cacheable(NLP_MAP_CACHE_KEY)
+  async getNlpMap(): Promise<NlpCacheMap> {
+    const entities = await this.findAllAndPopulate();
+    return entities.reduce((acc, curr) => {
+      acc.set(curr.name, curr);
+      return acc;
+    }, new Map());
+  }
 }
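Stripped of the NestJS and caching machinery, the map-building step of `getNlpMap` is a plain fold over the entity list. A standalone sketch follows; the simplified `EntityLike` shape is an assumption for illustration, standing in for `NlpEntityFull`:

```typescript
interface EntityLike {
  name: string;
  weight: number;
  values: { value: string }[];
}

// Build a name-keyed lookup map, as getNlpMap does with populated entities.
function buildNlpMap(entities: EntityLike[]): Map<string, EntityLike> {
  return entities.reduce((acc, curr) => {
    acc.set(curr.name, curr);
    return acc;
  }, new Map<string, EntityLike>());
}

const nlpMap = buildNlpMap([
  { name: 'intent', weight: 1, values: [{ value: 'greeting' }] },
  { name: 'subject', weight: 5, values: [{ value: 'product' }] },
]);
```

Keying by entity name makes the per-entity weight an O(1) lookup during block scoring instead of a repeated database query.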
@@ -6,6 +6,7 @@
  * 2. All derivative works must include clear attribution to the original creator and software, Hexastack and Hexabot, in a prominent location (e.g., in the software's "About" section, documentation, and README file).
  */

+import { CACHE_MANAGER } from '@nestjs/cache-manager';
 import { MongooseModule } from '@nestjs/mongoose';

 import { LanguageRepository } from '@/i18n/repositories/language.repository';
@@ -76,6 +77,12 @@ describe('NlpSampleEntityService', () => {
         NlpSampleEntityService,
         NlpEntityService,
         NlpValueService,
+        {
+          provide: CACHE_MANAGER,
+          useValue: {
+            del: jest.fn(),
+          },
+        },
       ],
     });
     [
@@ -6,6 +6,7 @@
  * 2. All derivative works must include clear attribution to the original creator and software, Hexastack and Hexabot, in a prominent location (e.g., in the software's "About" section, documentation, and README file).
  */

+import { CACHE_MANAGER } from '@nestjs/cache-manager';
 import { MongooseModule } from '@nestjs/mongoose';

 import { BaseSchema } from '@/utils/generics/base-schema';
@@ -58,6 +59,12 @@ describe('NlpValueService', () => {
         NlpEntityRepository,
         NlpValueService,
         NlpEntityService,
+        {
+          provide: CACHE_MANAGER,
+          useValue: {
+            del: jest.fn(),
+          },
+        },
       ],
     });
     [
@@ -18,3 +18,5 @@ export const LANGUAGES_CACHE_KEY = 'languages';
 export const DEFAULT_LANGUAGE_CACHE_KEY = 'default_language';

 export const ALLOWED_ORIGINS_CACHE_KEY = 'allowed_origins';
+
+export const NLP_MAP_CACHE_KEY = 'nlp_map';
5 api/src/utils/test/fixtures/nlpentity.ts vendored
@@ -1,5 +1,5 @@
 /*
- * Copyright © 2024 Hexastack. All rights reserved.
+ * Copyright © 2025 Hexastack. All rights reserved.
  *
  * Licensed under the GNU Affero General Public License v3.0 (AGPLv3) with the following additional terms:
  * 1. The name "Hexabot" is a trademark of Hexastack. You may not use this name in derivative works without express written permission.
@@ -17,18 +17,21 @@ export const nlpEntityFixtures: NlpEntityCreateDto[] = [
     lookups: ['trait'],
     doc: '',
     builtin: false,
+    weight: 1,
   },
   {
     name: 'first_name',
     lookups: ['keywords'],
     doc: '',
     builtin: false,
+    weight: 1,
   },
   {
     name: 'built_in',
     lookups: ['trait'],
     doc: '',
     builtin: true,
+    weight: 1,
   },
 ];
@@ -16,7 +16,7 @@ import { ButtonType, PayloadType } from '@/chat/schemas/types/button';
 import { CaptureVar } from '@/chat/schemas/types/capture-var';
 import { OutgoingMessageFormat } from '@/chat/schemas/types/message';
 import { BlockOptions, ContentOptions } from '@/chat/schemas/types/options';
-import { Pattern } from '@/chat/schemas/types/pattern';
+import { NlpPattern, Pattern } from '@/chat/schemas/types/pattern';
 import { QuickReplyType } from '@/chat/schemas/types/quick-reply';

 import { modelInstance } from './misc';
@@ -246,6 +246,121 @@ export const blockGetStarted = {
   message: ['Welcome! How are you ? '],
 } as unknown as BlockFull;

+export const mockNlpPatternsSetOne: NlpPattern[] = [
+  {
+    entity: 'intent',
+    match: 'value',
+    value: 'greeting',
+  },
+  {
+    entity: 'firstname',
+    match: 'value',
+    value: 'jhon',
+  },
+];
+
+export const mockNlpPatternsSetTwo: NlpPattern[] = [
+  {
+    entity: 'intent',
+    match: 'value',
+    value: 'affirmation',
+  },
+  {
+    entity: 'firstname',
+    match: 'value',
+    value: 'mark',
+  },
+];
+
+export const mockNlpPatternsSetThree: NlpPattern[] = [
+  {
+    entity: 'intent',
+    match: 'value',
+    value: 'greeting',
+  },
+  {
+    entity: 'firstname',
+    match: 'entity',
+  },
+];
+
+export const mockNlpBlock: BlockFull = {
+  ...baseBlockInstance,
+  name: 'Mock Nlp',
+  patterns: [
+    'Hello',
+    '/we*lcome/',
+    { label: 'Mock Nlp', value: 'MOCK_NLP' },
+    mockNlpPatternsSetOne,
+    [
+      {
+        entity: 'intent',
+        match: 'value',
+        value: 'greeting',
+      },
+      {
+        entity: 'firstname',
+        match: 'value',
+        value: 'doe',
+      },
+    ],
+  ],
+  trigger_labels: customerLabelsMock,
+  message: ['Good to see you again '],
+} as unknown as BlockFull;
+
+export const mockModifiedNlpBlock: BlockFull = {
+  ...baseBlockInstance,
+  name: 'Modified Mock Nlp',
+  patterns: [
+    'Hello',
+    '/we*lcome/',
+    { label: 'Modified Mock Nlp', value: 'MODIFIED_MOCK_NLP' },
+    mockNlpPatternsSetThree,
+  ],
+  trigger_labels: customerLabelsMock,
+  message: ['Hello there'],
+} as unknown as BlockFull;
+
+export const mockModifiedNlpBlockOne: BlockFull = {
+  ...baseBlockInstance,
+  name: 'Modified Mock Nlp One',
+  patterns: [
+    'Hello',
+    '/we*lcome/',
+    { label: 'Modified Mock Nlp One', value: 'MODIFIED_MOCK_NLP_ONE' },
+    mockNlpPatternsSetTwo,
+    [
+      {
+        entity: 'firstname',
+        match: 'entity',
+      },
+    ],
+  ],
+  trigger_labels: customerLabelsMock,
+  message: ['Hello Sir'],
+} as unknown as BlockFull;
+
+export const mockModifiedNlpBlockTwo: BlockFull = {
+  ...baseBlockInstance,
+  name: 'Modified Mock Nlp Two',
+  patterns: [
+    'Hello',
+    '/we*lcome/',
+    { label: 'Modified Mock Nlp Two', value: 'MODIFIED_MOCK_NLP_TWO' },
+    [
+      {
+        entity: 'firstname',
+        match: 'entity',
+      },
+    ],
+    mockNlpPatternsSetThree,
+  ],
+  trigger_labels: customerLabelsMock,
+  message: ['Hello Madam'],
+} as unknown as BlockFull;
 const patternsProduct: Pattern[] = [
   'produit',
   [
@@ -285,3 +400,5 @@ export const blockCarouselMock = {
 } as unknown as BlockFull;

 export const blocks: BlockFull[] = [blockGetStarted, blockEmpty];
+
+export const nlpBlocks: BlockFull[] = [blockGetStarted, mockNlpBlock];
@@ -1,5 +1,5 @@
 /*
- * Copyright © 2024 Hexastack. All rights reserved.
+ * Copyright © 2025 Hexastack. All rights reserved.
  *
  * Licensed under the GNU Affero General Public License v3.0 (AGPLv3) with the following additional terms:
  * 1. The name "Hexabot" is a trademark of Hexastack. You may not use this name in derivative works without express written permission.
@@ -7,6 +7,7 @@
  */

 import { NLU } from '@/helper/types';
+import { NlpCacheMap } from '@/nlp/schemas/types';

 export const nlpEntitiesGreeting: NLU.ParseEntities = {
   entities: [
@@ -27,3 +28,52 @@ export const nlpEntitiesGreeting: NLU.ParseEntities = {
     },
   ],
 };
+
+export const mockNlpEntitiesSetOne: NLU.ParseEntities = {
+  entities: [
+    {
+      entity: 'intent',
+      value: 'greeting',
+      confidence: 0.999,
+    },
+    {
+      entity: 'firstname',
+      value: 'jhon',
+      confidence: 0.5,
+    },
+  ],
+};
+
+export const mockNlpEntitiesSetTwo: NLU.ParseEntities = {
+  entities: [
+    {
+      entity: 'intent',
+      value: 'greeting',
+      confidence: 0.94,
+    },
+    {
+      entity: 'firstname',
+      value: 'doe',
+      confidence: 0.33,
+    },
+  ],
+};
+
+export const mockNlpCacheMap: NlpCacheMap = new Map([
+  [
+    'intent',
+    {
+      id: '67e3e41eff551ca5be70559c',
+      weight: 1,
+      values: ['greeting', 'affirmation'],
+    },
+  ],
+  [
+    'firstname',
+    {
+      id: '67e3e41eff551ca5be70559d',
+      weight: 1,
+      values: ['jhon', 'doe'],
+    },
+  ],
+]);
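These fixtures exercise the scoring side of the prioritization strategy: when several blocks match a user's input, each matched entity contributes its configured weight, scaled by the NLU confidence, and the highest-scoring block wins. The sketch below only illustrates that weighted-sum idea; the actual formula (including the penalty factor mentioned in the changelog) lives in the block service and is not part of this diff, so treat every name here as an assumption:

```typescript
interface ParsedEntity {
  entity: string;
  value: string;
  confidence: number;
}

interface PatternLike {
  entity: string;
  value?: string; // omitted => match on entity presence alone
}

// Hypothetical weighted-sum scoring (illustration only): every parsed entity
// that one of the block's NLP patterns matches contributes
// weight(entity) * confidence to the block's score.
function scoreBlock(
  parsed: ParsedEntity[],
  patterns: PatternLike[],
  weights: Map<string, number>,
): number {
  return parsed.reduce((score, e) => {
    const matched = patterns.some(
      (p) => p.entity === e.entity && (p.value === undefined || p.value === e.value),
    );
    return matched ? score + (weights.get(e.entity) ?? 1) * e.confidence : score;
  }, 0);
}
```

Under this scheme, raising an entity's weight (e.g. `subject` at 5 vs `intent` at 1 in the fixtures above) biases block selection toward blocks whose patterns match that entity.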
@@ -121,7 +121,9 @@
     "file_error": "File not found",
     "audio_error": "Audio not found",
     "video_error": "Video not found",
-    "missing_fields_error": "Please make sure that all required fields are filled"
+    "missing_fields_error": "Please make sure that all required fields are filled",
+    "weight_required_error": "Weight is required or invalid",
+    "weight_positive_integer_error": "Weight must be a positive integer"
   },
   "menu": {
     "terms": "Terms of Use",
@@ -348,6 +350,7 @@
     "nlp_lookup_trait": "Trait",
     "doc": "Documentation",
     "builtin": "Built-in?",
+    "weight": "Weight",
     "dataset": "Dataset",
     "yes": "Yes",
     "no": "No",
|
@ -121,7 +121,9 @@
|
|||||||
"file_error": "Fichier introuvable",
|
"file_error": "Fichier introuvable",
|
||||||
"audio_error": "Audio introuvable",
|
"audio_error": "Audio introuvable",
|
||||||
"video_error": "Vidéo introuvable",
|
"video_error": "Vidéo introuvable",
|
||||||
"missing_fields_error": "Veuillez vous assurer que tous les champs sont remplis correctement"
|
"missing_fields_error": "Veuillez vous assurer que tous les champs sont remplis correctement",
|
||||||
|
"weight_positive_integer_error": "Le poids doit être un nombre entier positif",
|
||||||
|
"weight_required_error": "Le poids est requis ou bien invalide"
|
||||||
},
|
},
|
||||||
"menu": {
|
"menu": {
|
||||||
"terms": "Conditions d'utilisation",
|
"terms": "Conditions d'utilisation",
|
||||||
@ -347,6 +349,7 @@
|
|||||||
"nlp_lookup_trait": "Trait",
|
"nlp_lookup_trait": "Trait",
|
||||||
"synonyms": "Synonymes",
|
"synonyms": "Synonymes",
|
||||||
"doc": "Documentation",
|
"doc": "Documentation",
|
||||||
|
"weight": "Poids",
|
||||||
"builtin": "Intégré?",
|
"builtin": "Intégré?",
|
||||||
"dataset": "Données",
|
"dataset": "Données",
|
||||||
"yes": "Oui",
|
"yes": "Oui",
|
||||||
|
@@ -156,8 +156,7 @@ function StackComponent<T extends GridValidRowModel>({
         disabled={
           (isDisabled && isDisabled(params.row)) ||
           (params.row.builtin &&
-            (requires.includes(PermissionAction.UPDATE) ||
-              requires.includes(PermissionAction.DELETE))
-          )
+            (requires.includes(PermissionAction.UPDATE) ||
+              requires.includes(PermissionAction.DELETE)))
         }
         onClick={() => {
           action && action(params.row);
@@ -167,6 +167,16 @@ const NlpEntity = () => {
       resizable: false,
       renderHeader,
     },
+    {
+      maxWidth: 210,
+      field: "weight",
+      headerName: t("label.weight"),
+      renderCell: (val) => <Chip label={val.value} variant="title" />,
+      sortable: true,
+      disableColumnMenu: true,
+      resizable: false,
+      renderHeader,
+    },
     {
       maxWidth: 90,
       field: "builtin",
@@ -60,6 +60,7 @@ export const NlpEntityVarForm: FC<ComponentFormProps<INlpEntity>> = ({
       name: nlpEntity?.name || "",
       doc: nlpEntity?.doc || "",
       lookups: nlpEntity?.lookups || ["keywords"],
+      weight: nlpEntity?.weight || 1,
     },
   });
   const validationRules = {
@@ -82,6 +83,7 @@ export const NlpEntityVarForm: FC<ComponentFormProps<INlpEntity>> = ({
       reset({
         name: nlpEntity.name,
         doc: nlpEntity.doc,
+        weight: nlpEntity.weight,
       });
     } else {
       reset();
@@ -121,6 +123,7 @@ export const NlpEntityVarForm: FC<ComponentFormProps<INlpEntity>> = ({
           required
           autoFocus
           helperText={errors.name ? errors.name.message : null}
+          disabled={nlpEntity?.builtin}
         />
       </ContentItem>
       <ContentItem>
@@ -128,8 +131,35 @@ export const NlpEntityVarForm: FC<ComponentFormProps<INlpEntity>> = ({
           label={t("label.doc")}
           {...register("doc")}
           multiline={true}
+          disabled={nlpEntity?.builtin}
         />
       </ContentItem>
+      <ContentItem>
+        <Input
+          label={t("label.weight")}
+          {...register("weight", {
+            valueAsNumber: true,
+            required: t("message.weight_required_error"),
+            min: {
+              value: 1,
+              message: t("message.weight_positive_integer_error"),
+            },
+            validate: (value) =>
+              value && Number.isInteger(value) && value! > 0
+                ? true
+                : t("message.weight_positive_integer_error"),
+          })}
+          type="number"
+          inputProps={{
+            min: 1,
+            step: 1,
+            inputMode: "numeric",
+            pattern: "[1-9][0-9]*",
+          }}
+          error={!!errors.weight}
+          helperText={errors.weight?.message}
+        />
+      </ContentItem>
     </ContentContainer>
   </form>
 </Wrapper>
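The `validate` rule registered on the weight field follows react-hook-form's convention: return `true` for valid input, or an error message string otherwise. Extracted as a pure function (the name and message below are illustrative, not part of the commit):

```typescript
// react-hook-form style validator: true when valid, a message string when not.
function validateWeightField(value: number | undefined): true | string {
  return value !== undefined && Number.isInteger(value) && value > 0
    ? true
    : 'Weight must be a positive integer';
}
```

Note that with `valueAsNumber: true`, an empty input parses to `NaN`, which this predicate also rejects since `Number.isInteger(NaN)` is `false`.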
@@ -1,5 +1,5 @@
 /*
- * Copyright © 2024 Hexastack. All rights reserved.
+ * Copyright © 2025 Hexastack. All rights reserved.
  *
  * Licensed under the GNU Affero General Public License v3.0 (AGPLv3) with the following additional terms:
  * 1. The name "Hexabot" is a trademark of Hexastack. You may not use this name in derivative works without express written permission.
@@ -19,6 +19,7 @@ export interface INlpEntityAttributes {
   lookups: Lookup[];
   doc?: string;
   builtin?: boolean;
+  weight?: number;
 }

 export enum NlpLookups {