n8n/n8n (Public)

feat: AI Workflow Builder core (no-changelog) #17423

Open — OlegIvaniv wants to merge 15 commits into master from ai-1107-ai-workflow-builder-core

Conversation

@OlegIvaniv (Contributor) commented on Jul 17, 2025 (edited):

AI Workflow Builder: Core Agent Architecture Refactoring (Part 1/3)

Overview

This PR introduces fundamental architectural changes to the AI Workflow Builder, refactoring from a sequential chain-based approach to a flexible agent-based architecture. This is the first of three PRs that need to be merged to fully implement the new system.

Architecture Overview

The new architecture centers around a LangGraph state machine that orchestrates tool execution and state management:

```mermaid
graph TD
    START([User Message]) --> AGENT[Agent<br/>Decides Actions]
    AGENT --> TOOLS{Tools Needed?}
    TOOLS -->|Yes| PARALLEL[Execute Tools<br/>in Parallel]
    TOOLS -->|No| RESPOND[Generate Response]
    PARALLEL --> OPS[Collect Operations]
    OPS --> PROCESS[Process Operations<br/>Update Workflow]
    PROCESS --> AGENT
    RESPOND --> END([Stream to Frontend])
    style AGENT fill:#e3f2fd
    style PARALLEL fill:#fff3e0
    style PROCESS fill:#f3e5f5
```

Key Changes

1. Core Agent Implementation

The refactoring replaces sequential chain execution with a stateful LangGraph agent that:

  • Executes multiple tools in parallel for better performance
  • Maintains conversation state across interactions
  • Handles iterative workflow building through a feedback loop
  • Recovers gracefully from errors without losing context

Implementation: packages/@n8n/ai-workflow-builder.ee/src/workflow-builder-agent.ts

2. Tool System Implementation

Six specialized tools replace the previous chain components:

  • node-search: Fuzzy search across available node types
  • node-details: Retrieve detailed parameter schemas for nodes
  • add-node: Add nodes with intelligent positioning and naming
  • connect-nodes: Create connections with proper output handling
  • remove-node: Safely remove nodes and their connections
  • update-node-parameters: Configure node parameters using a dedicated LLM chain

Tools execute in parallel when possible, improving response times for complex operations.

3. State Management and Operations Queue

The system uses an operations queue pattern to ensure atomic state updates:

  • State Management: packages/@n8n/ai-workflow-builder.ee/src/workflow-state.ts
  • Operations Processing: packages/@n8n/ai-workflow-builder.ee/src/utils/operations-processor.ts

Key features:

  • Tools return operations instead of directly mutating state
  • Operations accumulate from parallel tool execution
  • Processing applies operations atomically in correct order
  • Enables future undo/redo capabilities
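The queue-then-apply flow above can be sketched in a few lines. This is an illustrative sketch only: `SimpleNode`, `Workflow`, and the three operation variants are simplified stand-ins for the real types in workflow-state.ts, not the actual n8n definitions.

```typescript
// Simplified stand-in shapes (assumptions, not the real n8n types).
type SimpleNode = { id: string; name: string };
type Workflow = { nodes: SimpleNode[]; connections: Record<string, string[]> };

type Operation =
  | { type: 'addNodes'; nodes: SimpleNode[] }
  | { type: 'removeNode'; nodeId: string }
  | { type: 'mergeConnections'; connections: Record<string, string[]> };

// Tools only return Operation values; this single function applies the whole
// accumulated queue in one pass and returns a new workflow object, so a
// partial update is never observable.
function applyOperations(workflow: Workflow, operations: Operation[]): Workflow {
  const next: Workflow = {
    nodes: [...workflow.nodes],
    connections: { ...workflow.connections },
  };
  for (const op of operations) {
    switch (op.type) {
      case 'addNodes':
        next.nodes = [...next.nodes, ...op.nodes];
        break;
      case 'removeNode':
        next.nodes = next.nodes.filter((n) => n.id !== op.nodeId);
        delete next.connections[op.nodeId];
        break;
      case 'mergeConnections':
        next.connections = { ...next.connections, ...op.connections };
        break;
    }
  }
  return next;
}
```

Because the input workflow is never mutated, keeping old snapshots around is enough to support the undo/redo mentioned above.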

4. Message Streaming and Real-time Updates

The streaming architecture provides real-time feedback:

  • Stream Processor: packages/@n8n/ai-workflow-builder.ee/src/utils/stream-processor.ts
  • Progress Tracking: packages/@n8n/ai-workflow-builder.ee/src/tools/helpers/progress.ts

Stream modes:

  • updates: State changes and agent messages
  • custom: Tool execution progress
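A minimal sketch of how chunks in the two stream modes could be routed to typed frontend messages. The chunk and output shapes are assumptions for illustration; the real routing lives in stream-processor.ts.

```typescript
// Assumed chunk shapes for the two stream modes described above.
type StreamChunk =
  | { mode: 'updates'; payload: { messages?: string[]; workflowJSON?: object } }
  | { mode: 'custom'; payload: { tool: string; status: 'running' | 'completed' } };

type RoutedOutput =
  | { kind: 'agent-message'; text: string }
  | { kind: 'workflow-update'; workflow: object }
  | { kind: 'tool-progress'; tool: string; status: string };

// Route one streamed chunk to zero or more typed UI messages.
function routeChunk(chunk: StreamChunk): RoutedOutput[] {
  if (chunk.mode === 'custom') {
    return [{ kind: 'tool-progress', tool: chunk.payload.tool, status: chunk.payload.status }];
  }
  const out: RoutedOutput[] = [];
  for (const text of chunk.payload.messages ?? []) {
    out.push({ kind: 'agent-message', text });
  }
  if (chunk.payload.workflowJSON) {
    out.push({ kind: 'workflow-update', workflow: chunk.payload.workflowJSON });
  }
  return out;
}
```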

5. Session Management

Conversation persistence through LangGraph's MemorySaver:

  • Thread format: workflow-{workflowId}-user-{userId}
  • Commands: /clear (reset session), /compact (summarize conversation)
  • Automatic session restoration on reconnection
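The thread id format is quoted from the description above; the command classification below is a sketch of how /clear and /compact could be dispatched before reaching the agent (function names are illustrative).

```typescript
// Thread id format as stated in the PR description.
function buildThreadId(workflowId: string, userId: string): string {
  return `workflow-${workflowId}-user-${userId}`;
}

type SessionAction = 'clear' | 'compact' | 'chat';

// Decide whether a message is a session command or a regular chat turn.
function classifyMessage(message: string): SessionAction {
  const trimmed = message.trim();
  if (trimmed === '/clear') return 'clear';
  if (trimmed === '/compact') return 'compact';
  return 'chat';
}
```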

6. Supporting Chains

Three chains enhance the agent's capabilities:

  1. Parameter Updater: Dynamically assembles prompts for node configuration from an array of natural-language instructions
  2. Conversation Compact: Reduces conversation size while preserving context
  3. Planner: Legacy chain kept for future enhancements

7. Error Handling

Error handling maintains conversation flow:

  • Structured error types (ValidationError, NodeTypeNotFoundError)
  • Tool-level error isolation prevents cascade failures
  • Errors returned as messages for graceful degradation
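The error-to-message conversion can be sketched as follows. The class names match the PR description, but their fields and the message shape are assumptions made for this example.

```typescript
// Structured error types named in the PR; the `code` fields are illustrative.
class ValidationError extends Error {
  readonly code = 'VALIDATION_ERROR';
}
class NodeTypeNotFoundError extends Error {
  readonly code = 'NODE_TYPE_NOT_FOUND';
  constructor(nodeType: string) {
    super(`Node type not found: ${nodeType}`);
  }
}

// Instead of letting a tool failure crash the agent loop, convert the error
// into a tool message so the conversation (and the LLM) can recover.
function toErrorMessage(error: unknown): { role: 'tool'; content: string } {
  const text =
    error instanceof ValidationError || error instanceof NodeTypeNotFoundError
      ? `${error.code}: ${error.message}`
      : error instanceof Error
        ? error.message
        : 'Unknown error';
  return { role: 'tool', content: text };
}
```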

Technical Deep Dive

1. LangGraph Agent State Machine

The agent orchestrates all operations through a state machine:

```mermaid
graph TD
    START([Start]) --> CHECK{Check Command}
    CHECK -->|"/clear"| DELETE[delete_messages]
    CHECK -->|"/compact"| COMPACT[compact_messages]
    CHECK -->|"regular message"| AGENT[agent]
    AGENT --> DECIDE{Has Tool Calls?}
    DECIDE -->|Yes| TOOLS[tools<br/>Parallel Execution]
    DECIDE -->|No| END([End])
    TOOLS --> PROCESS[process_operations<br/>Apply Mutations]
    PROCESS --> AGENT
    DELETE --> END
    COMPACT --> END
    style AGENT fill:#e3f2fd
    style TOOLS fill:#fff3e0
    style PROCESS fill:#f3e5f5
    style DELETE fill:#ffebee
    style COMPACT fill:#e8f5e9
```

State Flow Details:

  • agent: Calls LLM with available tools, decides next actions
  • tools: Executes multiple tools in parallel for efficiency
  • process_operations: Applies accumulated operations to workflow
  • delete_messages: Clears conversation history
  • compact_messages: Summarizes long conversations
  • See: packages/@n8n/ai-workflow-builder.ee/src/workflow-builder-agent.ts
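The "Has Tool Calls?" branch above is a conditional edge; its decision function can be sketched like this. The message shape is illustrative, not the real LangChain type.

```typescript
// Assumed minimal shape of the last LLM message.
type AgentMessage = { toolCalls?: Array<{ name: string }> };

// Conditional edge after the agent node: run tools if any were requested,
// otherwise end the turn and stream the response.
function nextNode(lastMessage: AgentMessage): 'tools' | 'end' {
  return lastMessage.toolCalls && lastMessage.toolCalls.length > 0 ? 'tools' : 'end';
}
```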

2. State Mutations Through Operations Queue

The system ensures atomic state updates through an operations queue pattern:

```mermaid
stateDiagram-v2
    [*] --> ToolExecution: Tools Execute
    state ToolExecution {
        [*] --> Parallel
        Parallel --> Tool1: add_nodes
        Parallel --> Tool2: connect_nodes
        Parallel --> Tool3: update_parameters
        Tool1 --> Op1: Returns Operations
        Tool2 --> Op2: Returns Operations
        Tool3 --> Op3: Returns Operations
    }
    ToolExecution --> OperationsQueue: Collect All Operations
    state OperationsQueue {
        Queue: Operations Array
        Queue --> Reducer: operationsReducer
        Reducer --> Merged: Merged Operations
    }
    OperationsQueue --> ProcessNode: process_operations
    state ProcessNode {
        Apply: applyOperations()
        Apply --> Clear: addNodes
        Clear --> Update: removeNode
        Update --> Connect: updateNode
        Connect --> Merge: mergeConnections
    }
    ProcessNode --> WorkflowJSON: Updated Workflow
    ProcessNode --> ClearQueue: operations = null
    WorkflowJSON --> [*]
```

Key Concepts:

  • Tools return operations, not direct mutations
  • Operations queue accumulates changes from parallel tool execution
  • Reducer handles special cases (clear operations, append logic)
  • Process node applies all operations atomically
  • Operations cleared after successful processing
  • See: packages/@n8n/ai-workflow-builder.ee/src/workflow-state.ts
  • See: packages/@n8n/ai-workflow-builder.ee/src/utils/operations-processor.ts
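A sketch of the reducer semantics described above. The exact special-case handling (what a "clear" operation discards, and that `null` empties the queue after processing) is an assumption for illustration; the real reducer lives in workflow-state.ts.

```typescript
type Op = { type: string };

// Assumed semantics: a null update empties the queue after processing; a
// 'clear' marker inside an update discards everything accumulated before it;
// otherwise updates append to the current queue.
function operationsReducer(current: Op[] | null, update: Op[] | null): Op[] | null {
  if (update === null) return null; // queue cleared after successful processing
  const base = current ?? [];
  const clearIndex = update.findIndex((op) => op.type === 'clear');
  if (clearIndex >= 0) return update.slice(clearIndex + 1); // drop earlier ops
  return [...base, ...update];
}
```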

3. Stream Processing and Real-time Updates

The system provides real-time feedback through streaming:

```mermaid
flowchart TD
    subgraph Tool Execution
        T1[Tool Invoked] --> PR[Progress Reporter]
        PR --> E1[Starting...]
        PR --> E2[Validating node type...]
        PR --> E3[Creating connections...]
        PR --> E4[Success/Error]
    end
    subgraph Stream Processing
        E1 --> CE[Custom Events]
        E2 --> CE
        E3 --> CE
        E4 --> CE
        CE --> SP[Stream Processor]
        SP --> MODE{Stream Mode?}
        MODE -->|updates| AU[Agent Updates]
        MODE -->|custom| TP[Tool Progress]
        AU --> AM[Agent Messages]
        AU --> WU[Workflow Updates]
        TP --> TM[Tool Messages]
    end
    subgraph Frontend Display
        AM --> Chat[Chat UI]
        TM --> Tool[Tool Status UI]
        WU --> Canvas[Workflow Canvas]
    end
    style PR fill:#e8f5e9
    style SP fill:#fff3e0
    style MODE fill:#ffebee
```

Implementation Flow:

  1. Tools report progress via progressReporter.progress()
  2. Progress events stream as custom events
  3. Stream processor routes events based on type
  4. Frontend receives typed messages for UI updates
  5. Tool messages show real-time execution status
  • See: packages/@n8n/ai-workflow-builder.ee/src/utils/stream-processor.ts
  • See: packages/@n8n/ai-workflow-builder.ee/src/tools/helpers/progress.ts
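The reporter side of this flow can be sketched as below. The real implementation wraps LangGraph's stream writer from the tool config; here a plain callback stands in for it, and the event shape is an assumption.

```typescript
// Assumed event shape for tool progress custom events.
type ProgressEvent = { tool: string; status: 'running' | 'completed' | 'error'; detail?: string };

// Create a reporter bound to one tool; every call emits a typed custom event
// through the provided writer callback (stand-in for LangGraph's writer).
function createProgressReporter(write: (e: ProgressEvent) => void, tool: string) {
  return {
    progress: (detail: string) => write({ tool, status: 'running', detail }),
    complete: () => write({ tool, status: 'completed' }),
    error: (detail: string) => write({ tool, status: 'error', detail }),
  };
}
```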

4. Parameter Update Chain and Prompt Caching

The parameter update chain dynamically assembles prompts with caching for efficiency:

```mermaid
flowchart TD
    subgraph Cached Components
        BASE[Base Instructions<br/>Expression Rules]
        NODE[Node Parameters Definitions]
        EXAMPLES[Relevant Configuration<br/>Examples]
    end
    subgraph Dynamic Assembly
        REQ[User Request] --> ANALYZE[Analyze Node Type]
        ANALYZE --> BUILD[Build Prompt]
        BASE --> BUILD
        NODE --> BUILD
        EXAMPLES --> BUILD
        BUILD --> CACHE{Cached?}
        CACHE -->|Yes| FAST[Reduced Tokens]
        CACHE -->|No| FULL[Full Token Cost]
    end
    FAST --> LLM[LLM Processing]
    FULL --> LLM
    LLM --> OUTPUT[Updated Parameters]
    style BASE fill:#e8f5e9
    style NODE fill:#e8f5e9
    style EXAMPLES fill:#e8f5e9
```

Key Features:

  • Anthropic prompt caching reduces token costs
  • Node-specific prompts for HTTP, If, Database nodes
  • Dynamic example selection based on context
  • Structured output for reliable parsing
  • See: packages/@n8n/ai-workflow-builder.ee/src/chains/parameter-updater.ts
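Provider-side prompt caching rewards a stable prefix, so assembly keeps the cacheable segments first and the request-specific text last. The segment shape and function below are a simplified sketch, not the real chain code.

```typescript
// Assumed segment shape: static instruction blocks are marked cacheable.
interface PromptSegment {
  text: string;
  cacheable: boolean;
}

// Cacheable segments form a stable prefix (eligible for provider caching);
// dynamic segments and the user request are appended after it.
function assemblePrompt(segments: PromptSegment[], userRequest: string): string {
  const stable = segments.filter((s) => s.cacheable).map((s) => s.text);
  const dynamic = segments.filter((s) => !s.cacheable).map((s) => s.text);
  return [...stable, ...dynamic, `User request: ${userRequest}`].join('\n\n');
}
```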

Integration Points

Service Layer Integration

  • packages/cli/src/services/ai-workflow-builder.service.ts: Manages AI builder lifecycle
  • Handles LLM provider configuration (OpenAI/Anthropic)
  • Integrates with AI Assistant Client for enterprise deployments

API Controller

  • packages/cli/src/controllers/ai.controller.ts: Streaming endpoint
  • JSON Lines format with custom delimiter: ⧉⇋⇋➽⌑⧉§§\n
  • Rate limiting: 100 requests per window
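Client-side, a stream framed with that delimiter can be split as sketched below; the delimiter string is quoted from the description above, while the buffering function itself is an illustrative assumption.

```typescript
// Custom delimiter quoted from the PR description.
const DELIMITER = '⧉⇋⇋➽⌑⧉§§\n';

// Split a received buffer into complete JSON messages; an incomplete trailing
// fragment is returned as `rest` to be prepended to the next read.
function splitStream(buffer: string): { messages: object[]; rest: string } {
  const parts = buffer.split(DELIMITER);
  const rest = parts.pop() ?? '';
  const messages = parts.filter((p) => p.length > 0).map((p) => JSON.parse(p) as object);
  return { messages, rest };
}
```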

Frontend State Management

  • Builder store processes streaming messages
  • Real-time canvas synchronization
  • Tool execution visualization

Review / Merge checklist

  • PR title and summary are descriptive. (conventions)
  • Docs updated or follow-up ticket created.
  • Tests included.
  • PR labeled with release/backport (if the PR is an urgent fix that needs to be backported)

…-changelog)

  • Replace chain-based system with tool-based agent architecture
  • Add tools for node operations (add, connect, remove, update)
  • Implement new prompt engineering system
  • Add node search engine with fuzzy matching
  • Add tests
@OlegIvaniv requested review from a team and burivuhster and removed the request for a team on July 17, 2025 18:48
@n8n-assistant (bot) added the n8n team label (Authored by the n8n team) on Jul 17, 2025
@cubic-dev-ai (bot) left a comment:

cubic found 15 issues across 77 files. Review them in cubic.dev.

React with 👍 or 👎 to teach cubic. Tag @cubic-dev-ai to give specific feedback.

From packages/@n8n/ai-workflow-builder.ee/src/tools/remove-node.tool.ts:

```typescript
/**
 * Count connections that will be removed for a node
 */
function countNodeConnections(nodeId: string, connections: IConnections): number {
```

Outgoing connections to the same node are counted once in the "outgoing" loop and again in the "incoming" loop, so self-loop connections will be double-counted, leading to an incorrect connectionsRemoved value that is shown to users.
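A hedged sketch of the fix the comment suggests: count the outgoing pass as before, but skip the node's own entry in the incoming pass so a self-loop is counted once. The connection shape here is simplified from n8n's IConnections.

```typescript
// Simplified stand-in for n8n's IConnections shape.
type Connections = Record<string, Record<string, Array<Array<{ node: string }>>>>;

function countNodeConnections(nodeId: string, connections: Connections): number {
  let count = 0;
  // Outgoing connections (this pass already covers self-loops).
  for (const outputsByType of Object.values(connections[nodeId] ?? {})) {
    for (const outputs of outputsByType) {
      count += outputs.length;
    }
  }
  // Incoming connections from OTHER nodes only, so self-loops are not
  // counted a second time.
  for (const [sourceId, outputsByType] of Object.entries(connections)) {
    if (sourceId === nodeId) continue;
    for (const outputs of Object.values(outputsByType)) {
      for (const conns of outputs) {
        count += conns.filter((c) => c.node === nodeId).length;
      }
    }
  }
  return count;
}
```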


From packages/@n8n/ai-workflow-builder.ee/src/tools/node-search.tool.ts:

```typescript
/**
 * Search query schema - simplified for better LLM compatibility
 */
const searchQuerySchema = z.object({
  queryType: z.enum(['name', 'subNodeSearch']).describe('Type of search to perform'),
  query: z.string().optional().describe('Search term to filter results'),
```

The validation schema allows queryType "name" while leaving query undefined, but processQuery later treats a missing query as an error case that silently returns no results. This leads to silent failures that should instead be caught during validation.
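The stricter check the comment asks for can be sketched in plain TypeScript (the real schema uses zod, where a cross-field rule like this could be expressed with a refinement; the function and error strings below are illustrative).

```typescript
// Simplified stand-in for the zod-inferred SearchQuery type.
type SearchQuery = { queryType: 'name' | 'subNodeSearch'; query?: string; connectionType?: string };

// Validate cross-field requirements up front instead of silently returning
// no results later in processQuery.
function validateSearchQuery(q: SearchQuery): string[] {
  const errors: string[] = [];
  if (q.queryType === 'name' && !q.query) {
    errors.push('query is required when queryType is "name"');
  }
  if (q.queryType === 'subNodeSearch' && !q.connectionType) {
    errors.push('connectionType is required when queryType is "subNodeSearch"');
  }
  return errors;
}
```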


From packages/@n8n/ai-workflow-builder.ee/test/test-utils.ts:

```typescript
const mockGetCurrentTaskInput = getCurrentTaskInput as jest.MockedFunction<
  typeof getCurrentTaskInput
>;

jest.mock('@langchain/langgraph', () => ({
```

jest.mock is called inside a helper function, so the mock is applied only after the function is executed rather than being hoisted to the top of the module. This can lead to the real module being loaded before the mock is in place, causing unpredictable test behaviour.

==========++// Expect tool success message+export const expectToolSuccess = (+content: ParsedToolContent,+expectedMessage: string | RegExp,+): void =&gt; {+const message = content.update.messages[0]?.kwargs.content;+expect(message).toBeDefined();+if (typeof expectedMessage === &#39;string&#39;) {+expect(message).toContain(expectedMessage);+} else {+expect(message).toMatch(expectedMessage);+}+};++// Expect tool error message+export const expectToolError = (+content: ParsedToolContent,+expectedError: string | RegExp,+): void =&gt; {+const message = content.update.messages[0]?.kwargs.content;+if (typeof expectedError === &#39;string&#39;) {+expect(message).toBe(expectedError);+} else {+expect(message).toMatch(expectedError);+}+};++// Expect workflow operation of specific type+export const expectWorkflowOperation = (+content: ParsedToolContent,+operationType: string,+matcher?: Record&lt;string, unknown&gt;,+): void =&gt; {+const operation = content.update.workflowOperations?.[0];+expect(operation).toBeDefined();+expect(operation?.type).toBe(operationType);+if (matcher) {+expect(operation).toMatchObject(matcher);+}+};++// Expect node was added+export const expectNodeAdded = (content: ParsedToolContent, expectedNode: Partial&lt;INode&gt;): void =&gt; {+expectWorkflowOperation(content, &#39;addNodes&#39;);+const addedNode = content.update.workflowOperations?.[0]?.nodes?.[0];+expect(addedNode).toBeDefined();+expect(addedNode).toMatchObject(expectedNode);+};++// Expect node was removed+export const expectNodeRemoved = (content: ParsedToolContent, nodeId: string): void =&gt; {+expectWorkflowOperation(content, &#39;removeNode&#39;, { nodeIds: [nodeId] });+};++// Expect connections were added+export const expectConnectionsAdded = (+content: ParsedToolContent,+expectedCount?: number,+): void =&gt; {+expectWorkflowOperation(content, &#39;addConnections&#39;);+if (expectedCount !== undefined) {+const connections = 
content.update.workflowOperations?.[0]?.connections;+expect(connections).toHaveLength(expectedCount);+}+};++// Expect node was updated+export const expectNodeUpdated = (+content: ParsedToolContent,+nodeId: string,+expectedUpdates?: Record&lt;string, unknown&gt;,+): void =&gt; {+expectWorkflowOperation(content, &#39;updateNode&#39;, {+nodeId,+...(expectedUpdates ? { updates: expect.objectContaining(expectedUpdates) } : {}),+});+};++// ========== Test Data Builders ==========++// Build add node input+export const buildAddNodeInput = (overrides: {+nodeType: string;+name?: string;+connectionParametersReasoning?: string;+connectionParameters?: Record&lt;string, unknown&gt;;+}) =&gt; ({+nodeType: overrides.nodeType,+name: overrides.name ?? &#39;Test Node&#39;,+connectionParametersReasoning:+overrides.connectionParametersReasoning ??+&#39;Standard node with static inputs/outputs, no connection parameters needed&#39;,+connectionParameters: overrides.connectionParameters ?? {},+});++// Build connect nodes input+export const buildConnectNodesInput = (overrides: {+sourceNodeId: string;+targetNodeId: string;+sourceOutputIndex?: number;+targetInputIndex?: number;+}) =&gt; ({+sourceNodeId: overrides.sourceNodeId,+targetNodeId: overrides.targetNodeId,+sourceOutputIndex: overrides.sourceOutputIndex ?? 0,+targetInputIndex: overrides.targetInputIndex ?? 
0,+});++// Build node search query+export const buildNodeSearchQuery = (+queryType: &#39;name&#39; | &#39;subNodeSearch&#39;,+query?: string,+connectionType?: NodeConnectionType,+) =&gt; ({+queryType,+...(query &amp;&amp; { query }),+...(connectionType &amp;&amp; { connectionType }),+});++// Build update node parameters input+export const buildUpdateNodeInput = (nodeId: string, changes: string[]) =&gt; ({+nodeId,+changes,+});++// Build node details input+export const buildNodeDetailsInput = (overrides: {+nodeName: string;+withParameters?: boolean;+withConnections?: boolean;+}) =&gt; ({+nodeName: overrides.nodeName,+withParameters: overrides.withParameters ?? false,+withConnections: overrides.withConnections ?? true,+});++// Expect node details in response+export const expectNodeDetails = (+content: ParsedToolContent,+expectedDetails: Partial&lt;{+name: string;+displayName: string;+description: string;+subtitle?: string;+}&gt;,+): void =&gt; {+const message = content.update.messages[0]?.kwargs.content;+expect(message).toBeDefined();++// Check for expected XML-like tags in formatted output+if (expectedDetails.name) {+expect(message).toContain(`&lt;name&gt;${expectedDetails.name}&lt;/name&gt;`);+}+if (expectedDetails.displayName) {+expect(message).toContain(`&lt;display_name&gt;${expectedDetails.displayName}&lt;/display_name&gt;`);+}+if (expectedDetails.description) {+expect(message).toContain(`&lt;description&gt;${expectedDetails.description}&lt;/description&gt;`);+}+if (expectedDetails.subtitle) {+expect(message).toContain(`&lt;subtitle&gt;${expectedDetails.subtitle}&lt;/subtitle&gt;`);+}+};++// Helper to validate XML-like structure in output+export const expectXMLTag = (+content: string,+tagName: string,+expectedValue?: string | RegExp,+): void =&gt; {+const tagRegex = new RegExp(`&lt;${tagName}&gt;([\\s\\S]*?)&lt;/${tagName}&gt;`);+const match = content.match(tagRegex);+expect(match).toBeDefined();+if (expectedValue) {+if (typeof expectedValue === 
&#39;string&#39;) {+expect(match?.[1]?.trim()).toBe(expectedValue);+} else {+expect(match?.[1]).toMatch(expectedValue);+}+}+};++// Common reasoning strings+export const REASONING = {+STATIC_NODE: &#39;Node has static inputs/outputs, no connection parameters needed&#39;,+DYNAMIC_AI_NODE: &#39;AI node has dynamic inputs, setting connection parameters&#39;,+TRIGGER_NODE: &#39;Trigger node, no connection parameters needed&#39;,+WEBHOOK_NODE: &#39;Webhook is a trigger node, no connection parameters needed&#39;,+} as const;</file context>
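One helper from this test-utilities context worth calling out is `parseToolResult`, which unwraps tool results whose `content` may itself be a JSON-encoded wrapper. A self-contained sketch of that behavior (plain `JSON.parse` stands in for n8n's `jsonParse`; types are simplified):

```typescript
// Tool results arrive as a Command-like object whose `content` is JSON.
type MockedCommandResult = { content: string };

// Unwrap one layer; if the payload is double-wrapped (the parsed object has
// its own `content` string), unwrap a second time.
const parseToolResult = <T>(result: unknown): T => {
	const parsed = JSON.parse((result as MockedCommandResult).content);
	return parsed.content ? (JSON.parse(parsed.content) as T) : (parsed as T);
};

// Single-wrapped: the payload is directly in `content`.
const single = parseToolResult<{ ok: boolean }>({
	content: JSON.stringify({ ok: true }),
});
console.log(single.ok); // true

// Double-wrapped: `content` holds another JSON string under `content`.
const double = parseToolResult<{ ok: boolean }>({
	content: JSON.stringify({ content: JSON.stringify({ ok: false }) }),
});
console.log(double.ok); // false
```

The double-wrap case arises because the mocked `Command` serializes its params into `content`, which may already contain a serialized payload.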

```ts
// Start with a copy of the current workflow
let result: SimpleWorkflow = {
	nodes: [...workflow.nodes],
	connections: { ...workflow.connections },
```
cubic-dev-ai[bot] (Contributor) commented on `src/utils/operations-processor.ts`:
Shallow-copying only the top-level `connections` object means nested arrays/objects are still shared with the original workflow. Subsequent mutations inside `mergeConnections` therefore mutate both the new and the original workflow, breaking immutability assumptions and causing hard-to-trace state bugs.

Prompt for AI agents (collapsed): address the comment above on `packages/@n8n/ai-workflow-builder.ee/src/utils/operations-processor.ts` at line 17. File context:

```ts
import type { INode, IConnections } from 'n8n-workflow';

import type { SimpleWorkflow, WorkflowOperation } from '../types/workflow';
import type { WorkflowState } from '../workflow-state';

/**
 * Apply a list of operations to a workflow
 */
// eslint-disable-next-line complexity
export function applyOperations(
	workflow: SimpleWorkflow,
	operations: WorkflowOperation[],
): SimpleWorkflow {
	// Start with a copy of the current workflow
	let result: SimpleWorkflow = {
		nodes: [...workflow.nodes],
		connections: { ...workflow.connections },
	};

	// Apply each operation in sequence
	for (const operation of operations) {
		switch (operation.type) {
			case 'clear':
				result = { nodes: [], connections: {} };
				break;

			case 'removeNode': {
				const nodesToRemove = new Set(operation.nodeIds);

				// Filter out removed nodes
				result.nodes = result.nodes.filter((node) => !nodesToRemove.has(node.id));

				// Clean up connections
				const cleanedConnections: IConnections = {};

				// Copy connections, excluding those from/to removed nodes
				for (const [sourceId, nodeConnections] of Object.entries(result.connections)) {
					if (!nodesToRemove.has(sourceId)) {
						cleanedConnections[sourceId] = {};

						for (const [connectionType, outputs] of Object.entries(nodeConnections)) {
							if (Array.isArray(outputs)) {
								cleanedConnections[sourceId][connectionType] = outputs.map((outputConnections) => {
									if (Array.isArray(outputConnections)) {
										return outputConnections.filter((conn) => !nodesToRemove.has(conn.node));
									}
									return outputConnections;
								});
							}
						}
					}
				}

				result.connections = cleanedConnections;
				break;
			}

			case 'addNodes': {
				// Create a map for quick lookup
				const nodeMap = new Map<string, INode>();
				result.nodes.forEach((node) => nodeMap.set(node.id, node));

				// Add or update nodes
				operation.nodes.forEach((node) => {
					nodeMap.set(node.id, node);
				});

				result.nodes = Array.from(nodeMap.values());
				break;
			}

			case 'updateNode': {
				result.nodes = result.nodes.map((node) => {
					if (node.id === operation.nodeId) {
						return { ...node, ...operation.updates };
					}
					return node;
				});
				break;
			}

			case 'setConnections': {
				// Replace connections entirely
				result.connections = operation.connections;
				break;
			}

			case 'mergeConnections': {
				// Merge connections additively
				for (const [sourceId, nodeConnections] of Object.entries(operation.connections)) {
					if (!result.connections[sourceId]) {
						result.connections[sourceId] = nodeConnections;
					} else {
						// Merge connections for this source node
						for (const [connectionType, newOutputs] of Object.entries(nodeConnections)) {
							if (!result.connections[sourceId][connectionType]) {
								result.connections[sourceId][connectionType] = newOutputs;
							} else {
								// Merge arrays of connections
								const existingOutputs = result.connections[sourceId][connectionType];

								if (Array.isArray(newOutputs) && Array.isArray(existingOutputs)) {
									// Merge each output index
									for (let i = 0; i < Math.max(newOutputs.length, existingOutputs.length); i++) {
										if (!newOutputs[i]) continue;

										if (!existingOutputs[i]) {
											existingOutputs[i] = newOutputs[i];
										} else if (Array.isArray(newOutputs[i]) && Array.isArray(existingOutputs[i])) {
											// Merge connections at this output index, avoiding duplicates
											const existingSet = new Set(
												existingOutputs[i]!.map((conn) =>
													JSON.stringify({ node: conn.node, type: conn.type, index: conn.index }),
												),
											);

											newOutputs[i]!.forEach((conn) => {
												const connStr = JSON.stringify({
													node: conn.node,
													type: conn.type,
													index: conn.index,
												});
												if (!existingSet.has(connStr)) {
													existingOutputs[i]!.push(conn);
												}
											});
										}
									}
								}
							}
						}
					}
				}
				break;
			}
		}
	}

	return result;
}

/**
 * Process operations node for the LangGraph workflow
 * This node applies accumulated operations to the workflow state
 */
export function processOperations(state: typeof WorkflowState.State) {
	const { workflowJSON, workflowOperations } = state;

	// If no operations to process, return unchanged
	if (!workflowOperations || workflowOperations.length === 0) {
		return {};
	}

	// Apply all operations to get the new workflow
	const newWorkflow = applyOperations(workflowJSON, workflowOperations);

	// Return updated state with cleared operations
	return {
		workflowJSON: newWorkflow,
		workflowOperations: null, // Clear processed operations
	};
}
```
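The operations reducer above applies each operation in sequence, with add-or-replace semantics keyed by node id. A minimal, self-contained sketch of that behavior (types pared down to what the example needs; not the real n8n interfaces):

```typescript
// Simplified stand-ins for the PR's SimpleWorkflow node list and operations.
type Node = { id: string; name: string };
type Operation =
	| { type: 'clear' }
	| { type: 'addNodes'; nodes: Node[] }
	| { type: 'removeNode'; nodeIds: string[] }
	| { type: 'updateNode'; nodeId: string; updates: Partial<Node> };

function applyOps(nodes: Node[], ops: Operation[]): Node[] {
	let result = [...nodes];
	for (const op of ops) {
		switch (op.type) {
			case 'clear':
				result = [];
				break;
			case 'addNodes': {
				// A Map keyed by id gives add-or-update semantics, as in applyOperations.
				const byId = new Map(result.map((n) => [n.id, n]));
				op.nodes.forEach((n) => byId.set(n.id, n));
				result = [...byId.values()];
				break;
			}
			case 'removeNode': {
				const gone = new Set(op.nodeIds);
				result = result.filter((n) => !gone.has(n.id));
				break;
			}
			case 'updateNode':
				result = result.map((n) => (n.id === op.nodeId ? { ...n, ...op.updates } : n));
				break;
		}
	}
	return result;
}

const out = applyOps(
	[{ id: 'a', name: 'Webhook' }],
	[
		{ type: 'addNodes', nodes: [{ id: 'b', name: 'Set' }, { id: 'a', name: 'Webhook v2' }] },
		{ type: 'removeNode', nodeIds: ['b'] },
	],
);
console.log(out); // [{ id: 'a', name: 'Webhook v2' }]
```

Re-adding an existing id replaces the node in place rather than duplicating it, which is what lets tools emit `addNodes` operations without first checking for existence.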
Suggested change:

```diff
-connections: { ...workflow.connections },
+connections: structuredClone(workflow.connections),
```
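The aliasing the reviewer describes is easy to reproduce. A minimal TypeScript sketch (simplified connection shape, illustrative names only) showing how a top-level spread shares nested arrays while `structuredClone` does not:

```typescript
type Conn = { node: string; type: string; index: number };
type Connections = Record<string, Record<string, Conn[][]>>;

const original: Connections = {
	Webhook: { main: [[{ node: 'Set', type: 'main', index: 0 }]] },
};

// Shallow copy: only the top-level object is new; nested arrays are shared.
const shallow = { ...original };
shallow.Webhook.main[0].push({ node: 'Slack', type: 'main', index: 0 });
console.log(original.Webhook.main[0].length); // 2 — the "copy" mutated the original

// structuredClone severs the aliasing, so pushes stay local to the clone.
const deep = structuredClone(original);
deep.Webhook.main[0].push({ node: 'Email', type: 'main', index: 0 });
console.log(original.Webhook.main[0].length); // still 2 — original untouched
```

`structuredClone` is available in Node.js 17+ and all modern browsers; for plain-JSON data like `IConnections`, a `JSON.parse(JSON.stringify(...))` round-trip would behave the same.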
@cubic-dev-ai[bot] (Contributor) left a comment:
cubic found 14 issues across 77 files. Review them in cubic.dev

React with 👍 or 👎 to teach cubic. Tag @cubic-dev-ai to give specific feedback.

A comment from @codecov was marked as outdated.
Reviewers:

- cubic-dev-ai[bot] left review comments
- burivuhster: awaiting requested review (automatically assigned from n8n-io/ai)

At least 1 approving review is required to merge this pull request.

Assignees: none
Labels: n8n team (Authored by the n8n team)
Projects: none
Milestone: none
Linked issues: none
Participants: @OlegIvaniv
