How I Slashed Angular Form Rendering Time by 90% Using WeakMap Caching

Last month, I faced a performance nightmare with our enterprise Angular application. Our form rendering was painfully slow, especially on lower-end devices. Users were complaining, management was getting anxious, and I was tasked with fixing it without rewriting the entire codebase.

Today, I'm sharing how I solved this with some clever TypeScript caching. No frameworks, no libraries—just smart code organization and strategic caching.

I'll admit, I was skeptical at first about whether such a simple concept could make a big difference. But after seeing our form render times drop by 90%, I'm convinced that strategic caching is one of the most underutilized performance tools in the Angular developer's toolkit.

The Performance Killer: Our Dynamic Forms System

Our app has these complex performance-agreement forms with different "process types" (OKR, SMART, Competency) that completely change how fields behave and look. Each field can:

  • Change its layout (column/row span) based on process type
  • Show/hide based on complex business rules
  • Have dynamic validation rules
  • Adapt its appearance based on view mode

The killer issue? Every time Angular ran change detection (which happens A LOT), we were recalculating these properties over and over. For a form with 20+ fields, this meant hundreds of redundant calculations per render cycle.

My initial time measurement showed that 80% of our render time was spent on these calculations. Not good.
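If you want to reproduce that kind of measurement before reaching for a full profiler, a crude first pass is to wrap the suspect functions in a timer and sum the results. Everything below (the `timed` helper, the fake calculation) is illustrative, not code from our app:

```typescript
// Illustrative timing harness: wraps a function and accumulates the total
// time spent in it, so you can see which calculations dominate a render cycle.
function timed<T extends (...args: any[]) => any>(
  name: string,
  fn: T,
  totals: Map<string, number>
): T {
  return ((...args: any[]) => {
    const start = performance.now();
    const result = fn(...args);
    totals.set(name, (totals.get(name) ?? 0) + (performance.now() - start));
    return result;
  }) as T;
}

const totals = new Map<string, number>();

// A stand-in for an expensive per-field calculation
const slowVisibility = timed(
  "isVisible",
  (n: number) => {
    let acc = 0;
    for (let i = 0; i < n; i++) acc += Math.sqrt(i);
    return acc > 0;
  },
  totals
);

// Simulate many change-detection passes, then inspect the totals
for (let i = 0; i < 100; i++) slowVisibility(10_000);
const visible = slowVisibility(10_000);
console.log(`isVisible total: ${totals.get("isVisible")?.toFixed(1)} ms`);
```

Once the totals point at a hot spot, a real profiler (Chrome DevTools, Angular DevTools) can confirm it.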

The Solution I Came Up With: WeakMap Caching

After digging through the code, I realized we needed a caching system that would:

  1. Cache results based on the specific field and its context
  2. Not cause memory leaks
  3. Be simple enough that my team could maintain it
  4. Work with our existing code without massive refactoring

My solution centered around TypeScript's WeakMap, which I'd never used in a real project before. Here's the approach:

```typescript
// I created a namespace to keep everything organized
namespace FieldUtils {
  // Different caches for different property types
  const caches = {
    displayProperties: new WeakMap(),
    editability: new WeakMap(),
    requirement: new WeakMap(),
    validation: new WeakMap(),
    visibility: new WeakMap()
  };

  // I made this helper to generate consistent cache keys
  // (sort a copy so we never mutate the caller's array)
  function createCacheKey(processTypes, entity) {
    return `${entity?.id || ''}_${processTypes.slice().sort().join(',')}`;
  }

  // Here's how one of the caching functions works
  function isVisible(field, processTypes, entity) {
    // Bail early if inputs are invalid
    if (!field || !processTypes.length) return false;

    // Create key and get the field's cache
    const cacheKey = createCacheKey(processTypes, entity);

    if (!caches.visibility.has(field)) {
      caches.visibility.set(field, new Map());
    }
    const fieldCache = caches.visibility.get(field);

    // Return cached value if we have it
    if (fieldCache.has(cacheKey)) {
      return fieldCache.get(cacheKey);
    }

    // Otherwise calculate, cache, and return
    const result = computeVisibility(field, processTypes, entity); // stand-in for our real visibility logic
    fieldCache.set(cacheKey, result);
    return result;
  }
}
```

Why WeakMap Is Perfect for This

I chose WeakMap specifically because:

  1. It uses object references as keys (perfect for our field objects)
  2. It automatically handles garbage collection - when a field object is no longer referenced elsewhere, its cache automatically gets cleaned up
  3. It's fast - O(1) lookup time

This approach meant zero memory management headaches. We never need to worry about clearing caches manually.
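A quick sketch of the semantics that make this work; the `FieldConfig` shape is a simplified stand-in for our real field objects:

```typescript
interface FieldConfig {
  key: string;
}

const visibilityCache = new WeakMap<FieldConfig, boolean>();

const field: FieldConfig = { key: "salary" };
visibilityCache.set(field, true);

const hit = visibilityCache.get(field); // O(1) lookup by object identity

// Once the last reference to a key object is dropped, its entry becomes
// eligible for garbage collection on its own; no manual clear() needed
// (which is also why WeakMap deliberately offers no way to enumerate entries).
let shortLived: FieldConfig | null = { key: "bonus" };
visibilityCache.set(shortLived, false);
shortLived = null; // the { key: "bonus" } entry can now be collected
```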

The Real-World Results

After implementing this across our field property systems, the results were dramatic:

  • Form rendering time dropped by ~90%
  • Scrolling became buttery smooth even on complex forms
  • CPU usage during form interaction plummeted
  • No more "Angular is running in development mode" warnings about excessive change detection

The best part? We didn't need to change our component code or templates at all. The caching happens entirely in the utility functions, making it a mostly transparent optimization.

Digging Into the Implementation Details

The Double-Map Approach

One of the most interesting aspects of this solution is what I call the "double-map" approach. We don't just use a WeakMap - we use a WeakMap of Maps:

```typescript
// This is what's really going on:
WeakMap<FieldConfig, Map<string, ResultType>>;
```

This two-level structure gives us incredible flexibility:

  • The outer WeakMap uses the field object as a key
  • The inner Map uses a string key representing the specific context (like process types and entity state)

This approach lets us cache multiple different results for the same field based on different contexts. For example, the same field might be visible with OKR process type but hidden with SMART process type.
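Here's a minimal, self-contained sketch of that two-level lookup; the counter shows that the calculation runs only once per (field, context) pair. The names are illustrative:

```typescript
const visibility = new WeakMap<object, Map<string, boolean>>();

function cachedVisibility(
  field: object,
  contextKey: string,
  compute: () => boolean
): boolean {
  // Outer level: per-field Map, keyed by the field object itself
  let perField = visibility.get(field);
  if (!perField) {
    perField = new Map<string, boolean>();
    visibility.set(field, perField);
  }
  // Inner level: per-context result, keyed by a context string
  if (!perField.has(contextKey)) {
    perField.set(contextKey, compute());
  }
  return perField.get(contextKey)!;
}

const field = { key: "objective" };
let computeCount = 0;
const compute = () => {
  computeCount++;
  return true;
};

const first = cachedVisibility(field, "OKR", compute);
const second = cachedVisibility(field, "OKR", compute); // cache hit
const other = cachedVisibility(field, "SMART", () => false); // new context
```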

The Cache Key Challenge

The trickiest part was designing proper cache keys. We needed keys that:

  1. Uniquely identified the calculation context
  2. Were consistent across calls with the same input values
  3. Weren't unnecessarily complex (to avoid string parsing overhead)

For entity-based calculations, we used the entity ID as the basis:

```typescript
function createEntityCacheKey(entity, additionalContext) {
  const entityId = entity?.id || "";
  return additionalContext ? `${entityId}_${additionalContext}` : entityId;
}
```

For process-type based calculations, we needed to ensure consistent ordering:

```typescript
function createProcessTypeKey(processTypes) {
  // Sort a copy to ensure a consistent key regardless of array order
  return processTypes.slice().sort().join(",");
}
```

This became crucial for complex cases where we combined both:

```typescript
// For visibility that depends on both entity state AND process types
const typesKey = createProcessTypeKey(processTypes);
const cacheKey = createEntityCacheKey(entity, typesKey);
```
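To make the key shapes concrete, here's a worked example with both helpers restated so the snippet runs on its own; the id and process-type values are illustrative:

```typescript
function createProcessTypeKey(processTypes: string[]): string {
  // Sort a copy to ensure a consistent key regardless of array order
  return processTypes.slice().sort().join(",");
}

function createEntityCacheKey(
  entity: { id?: string } | null,
  additionalContext?: string
): string {
  const entityId = entity?.id || "";
  return additionalContext ? `${entityId}_${additionalContext}` : entityId;
}

const typesKey = createProcessTypeKey(["SMART", "OKR"]); // "OKR,SMART"
const cacheKey = createEntityCacheKey({ id: "f1" }, typesKey); // "f1_OKR,SMART"

// Array order doesn't matter, so both call sites hit the same cache entry
const sameKey = createProcessTypeKey(["OKR", "SMART"]) === typesKey; // true
```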

The Validator Function Challenge

One particularly complex case was validation. Angular form validators don't naturally lend themselves to caching, since they're functions rather than plain values.

The challenge was that each field could have multiple different validation rules, and those rules could reference entity values. We solved this by creating a custom validation function factory:

```typescript
function createCachedValidator(rule, entity) {
  return (control) => {
    // For validators, we cache based on the control value as well
    const value = control.value;
    const valueKey =
      typeof value === "object" ? JSON.stringify(value) : String(value);
    const cacheKey = createEntityCacheKey(entity, valueKey);

    // Use our generic caching pattern
    return getCachedValue(rule, validationResultCache, cacheKey, () =>
      rule.validator(value, entity) ? null : { [rule.errorKey]: true }
    );
  };
}
```

This approach gave us validation that was both fast and properly integrated with Angular's form system.
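To show the idea in a runnable form without Angular imports, the "control" below is just an object with a `value` property standing in for `AbstractControl`, and the caching helper is inlined. The rule shape and names are illustrative:

```typescript
interface ValidationRule {
  errorKey: string;
  validator: (value: unknown, entity: unknown) => boolean;
}

type ValidationResult = Record<string, boolean> | null;

// One inner Map of results per rule object, keyed by entity id + control value
const validationResultCache = new WeakMap<
  ValidationRule,
  Map<string, ValidationResult>
>();

function createCachedValidator(rule: ValidationRule, entity: { id?: string }) {
  return (control: { value: unknown }): ValidationResult => {
    const value = control.value;
    const valueKey =
      typeof value === "object" ? JSON.stringify(value) : String(value);
    const cacheKey = `${entity?.id || ""}_${valueKey}`;

    let ruleCache = validationResultCache.get(rule);
    if (!ruleCache) {
      ruleCache = new Map<string, ValidationResult>();
      validationResultCache.set(rule, ruleCache);
    }
    if (ruleCache.has(cacheKey)) {
      return ruleCache.get(cacheKey) ?? null;
    }

    const result: ValidationResult = rule.validator(value, entity)
      ? null // Angular convention: null means "valid"
      : { [rule.errorKey]: true };
    ruleCache.set(cacheKey, result);
    return result;
  };
}

const required: ValidationRule = {
  errorKey: "required",
  validator: (v) => v != null && v !== ""
};
const validate = createCachedValidator(required, { id: "e1" });

const ok = validate({ value: "hello" }); // null, i.e. valid
const bad = validate({ value: "" }); // { required: true }
```

In the real app the returned function plugs straight into a `FormControl`'s validator list, since it has the `(control) => errors | null` shape Angular expects.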

Beyond Simple Properties: Layout Calculations

The most dramatic improvements came from caching field layout calculations. In our system, fields could specify their layout in several ways:

  1. Default layout (simple column/row span)
  2. Process-type specific layouts
  3. Custom calculation functions based on complex logic

Before caching, template bindings like this were causing major performance issues:

```html
<f-grid-item
  [colSpan]="getFieldColSpan(field, processTypes)"
  [rowSpan]="getFieldRowSpan(field, processTypes)"
></f-grid-item>
```

These calculations could involve multiple conditional checks, lookups, and function calls - all happening twice per field, every render cycle.

Our caching solution consolidated these into a single cached function:

```typescript
function getDisplayProperties(field, processTypes) {
  // Cache key based on process types
  const cacheKey = createProcessTypeKey(processTypes);

  // Get or create the field's cache
  if (!displayPropsCache.has(field)) {
    displayPropsCache.set(field, new Map());
  }
  const fieldCache = displayPropsCache.get(field);

  // Return cached result if available
  if (fieldCache.has(cacheKey)) {
    return fieldCache.get(cacheKey);
  }

  // Otherwise calculate the layout
  let result = { colSpan: 2, rowSpan: 1 }; // Default layout

  // Try the field's custom function first
  if (field.getDisplayProperties) {
    result = field.getDisplayProperties(processTypes);
  }
  // Then check process-type specific configurations
  else if (field.processTypeDisplayProperties) {
    for (const processType of processTypes) {
      const typeProps = field.processTypeDisplayProperties[processType];
      if (typeProps) {
        result = typeProps;
        break;
      }
    }
  }
  // Fall back to default display properties
  else if (field.displayProperties) {
    result = field.displayProperties;
  }

  // Cache and return
  fieldCache.set(cacheKey, result);
  return result;
}
```

Then we updated our component with a simpler method:

```typescript
getFieldDisplayValue(field, property) {
  const props = getDisplayProperties(field, this.processTypes);
  return props[property] || (property === 'colSpan' ? 2 : 1);
}
```

And our template became:

```html
<f-grid-item
  [colSpan]="getFieldDisplayValue(field, 'colSpan')"
  [rowSpan]="getFieldDisplayValue(field, 'rowSpan')"
></f-grid-item>
```

This change alone cut our rendering time in half because each field's display properties were now calculated exactly once per render cycle, regardless of how many properties we needed.

Refactoring Our Service Layer

With the core caching mechanism in place, we refactored our services to use a facade pattern. This kept the API clean while leveraging our optimizations:

```typescript
@Injectable({ providedIn: "root" })
export class FieldConfigService {
  // Original methods now delegate to cached implementations
  isFieldEditable(field, entity) {
    return FieldUtils.isEditable(field, entity);
  }

  isFieldRequired(field, entity) {
    return FieldUtils.isRequired(field, entity);
  }

  // New methods for the expanded functionality
  getFieldDisplayProperties(field, processTypes) {
    return FieldUtils.getDisplayProperties(field, processTypes);
  }
}
```

This gave us the best of both worlds: clean service APIs with optimized implementations.

Unexpected Benefits: Better Error Detection

An unexpected benefit of our caching approach was better error detection. By centralizing these calculations, we were able to add better error handling and debugging.

For example, in our validation function:

```typescript
function validate(field, value, entity) {
  // ... caching logic ...

  try {
    // Now we could add proper error boundaries around the validation
    if (field.validationRules) {
      field.validationRules.forEach((rule) => {
        try {
          if (!rule.validator(value, entity)) {
            errors.push(rule.errorKey);
          }
        } catch (error) {
          console.error(
            `Error validating field ${field.key} with rule ${rule.errorKey}:`,
            error
          );
          errors.push("internal.validationError");
        }
      });
    }
  } catch (error) {
    console.error(`Catastrophic error validating field ${field.key}:`, error);
    return ["internal.criticalError"];
  }

  // ... more logic ...
}
```

This improved error handling made debugging much easier and prevented cryptic UI errors.

Lessons I Learned

This experience taught me several valuable lessons:

  1. Measure before optimizing - My initial assumptions about what was causing slowness were wrong. Profiling showed the real culprits.

  2. WeakMap is underused in Angular - It's perfect for component-level caching where you need to associate data with objects that have lifecycle.

  3. TypeScript namespaces are still useful - Despite being considered "old-school" by some, they're great for organizing utility functions.

  4. Caching isn't just for HTTP calls - Calculation caching can have just as big an impact as network caching.

  5. Performance optimization doesn't have to be ugly - This solution actually made our code cleaner and more maintainable.

The Code Pattern I Use Now

I've since standardized this pattern in our codebase:

```typescript
function getCachedValue<T>(
  object: any, // The object to associate with
  cache: WeakMap<any, Map<string, T>>, // The WeakMap cache to use
  cacheKey: string, // String key for this specific context
  calculator: () => T // Function to run on cache miss
): T {
  // Initialize cache for this object if needed
  if (!cache.has(object)) {
    cache.set(object, new Map<string, T>());
  }

  const objectCache = cache.get(object);

  // Return cached value if it exists
  if (objectCache.has(cacheKey)) {
    return objectCache.get(cacheKey);
  }

  // Calculate, cache and return
  const value = calculator();
  objectCache.set(cacheKey, value);
  return value;
}
```

This has been a game-changer for us. Whenever we have expensive calculations that run repeatedly with the same inputs, we wrap them with this pattern.
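As a usage example, here's the helper restated (so the snippet runs on its own, with slightly stricter types) wrapping an illustrative "expensive" layout calculation; the counter confirms the calculator fires only on a cache miss:

```typescript
function getCachedValue<T>(
  object: object, // The object to associate with
  cache: WeakMap<object, Map<string, T>>, // The WeakMap cache to use
  cacheKey: string, // String key for this specific context
  calculator: () => T // Function to run on cache miss
): T {
  let objectCache = cache.get(object);
  if (!objectCache) {
    objectCache = new Map<string, T>();
    cache.set(object, objectCache);
  }
  if (objectCache.has(cacheKey)) {
    return objectCache.get(cacheKey)!;
  }
  const value = calculator();
  objectCache.set(cacheKey, value);
  return value;
}

// Illustrative usage: an "expensive" colSpan calculation, cached per context
const layoutCache = new WeakMap<object, Map<string, number>>();
const field = { key: "summary" };
let calls = 0;

const span1 = getCachedValue(field, layoutCache, "OKR,SMART", () => {
  calls++;
  return 2;
});
const span2 = getCachedValue(field, layoutCache, "OKR,SMART", () => {
  calls++;
  return 2;
}); // cache hit: calculator not invoked again
```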

Angular's Change Detection and Our Optimizations

One aspect I haven't discussed yet is how this interacts with Angular's change detection. Angular's default change detection strategy runs checks on all components after any event or async operation. For complex forms, this means a lot of recalculation.

We experimented with OnPush change detection, but that wasn't enough on its own. The problem was that our calculations were still being triggered during every check cycle because template bindings were calling functions.

Our caching solution dramatically reduced the work done during these check cycles. Even though the functions were still being called, they were now returning cached values instead of recalculating.

We did find one extra optimization though - Angular's binding system doesn't cache the results of method calls (like [colSpan]="getColSpan(field)"), but it does cache property access. This led us to experiment with a memoization decorator for component methods:

```typescript
function Memoize(
  target: any,
  propertyKey: string,
  descriptor: PropertyDescriptor
) {
  const originalMethod = descriptor.value;
  const cacheKey = Symbol("memoizeCache");

  descriptor.value = function (...args) {
    this[cacheKey] = this[cacheKey] || new Map();
    const key = JSON.stringify(args);

    if (this[cacheKey].has(key)) {
      return this[cacheKey].get(key);
    }

    const result = originalMethod.apply(this, args);
    this[cacheKey].set(key, result);
    return result;
  };

  return descriptor;
}
```

Then we could use it like this:

```typescript
@Memoize
getFieldDisplayValue(field, property) {
  const props = FieldUtils.getDisplayProperties(field, this.processTypes);
  return props[property] || (property === 'colSpan' ? 2 : 1);
}
```

This added an extra layer of caching at the component level, making the binding system more efficient. However, we had to be careful with this approach, as it could lead to stale values if not properly managed.

Real-World Trade-offs

While the caching approach was generally successful, we did face some trade-offs:

  1. Debugging complexity: Cached values sometimes made debugging more difficult since the actual calculation might have happened much earlier than when an issue was observed.

  2. Cache invalidation concerns: For some fields, we needed to explicitly clear caches when external state changed in ways our cache keys didn't capture.

  3. Memory usage: While WeakMap helped with garbage collection, our approach did increase memory usage somewhat. The performance benefits far outweighed this cost.

  4. TypeScript complexity: The typing for our caching system was nontrivial, especially with the nested maps and generic types.

Overall, these trade-offs were well worth it for the performance gains we achieved.
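On the cache-invalidation point in particular, it's worth noting how cheap the escape hatch is: because the outer key is the field object itself, dropping every cached context for one field is a single `delete` on the WeakMap. The names below are illustrative:

```typescript
const displayPropsCache = new WeakMap<object, Map<string, unknown>>();

// Explicitly drop all cached contexts for one field, e.g. when external
// state changes in a way the cache keys don't capture.
function invalidateField(field: object): boolean {
  return displayPropsCache.delete(field);
}

const field = { key: "goal" };
displayPropsCache.set(field, new Map([["OKR", { colSpan: 2 }]]));

const removed = invalidateField(field); // true
const stillCached = displayPropsCache.has(field); // false
```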

Would I Recommend This Approach?

Absolutely, but with a few caveats:

  1. Make sure you actually have a performance problem first
  2. Only cache calculations that are genuinely expensive
  3. Be careful with your cache keys to avoid incorrect cached values
  4. Use TypeScript's strong typing to keep the caching system maintainable
  5. Consider the memory implications, especially for very large forms
  6. Add proper cache debugging tools if your app is complex

Mehrshad Baqerzadegan

Sharing thoughts on technology and best practices.