# How I Slashed Angular Form Rendering Time by 90% Using WeakMap Caching
Last month, I faced a performance nightmare with our enterprise Angular application. Our form rendering was painfully slow, especially on lower-end devices. Users were complaining, management was getting anxious, and I was tasked with fixing it without rewriting the entire codebase.
Today, I'm sharing how I solved this with some clever TypeScript caching. No frameworks, no libraries—just smart code organization and strategic caching.
I'll admit, I was skeptical at first about whether such a simple concept could make a big difference. But after seeing our form render times drop by 90%, I'm convinced that strategic caching is one of the most underutilized performance tools in the Angular developer's toolkit.
## The Performance Killer: Our Dynamic Forms System
Our app has these complex performance-agreement forms with different "process types" (OKR, SMART, Competency) that completely change how fields behave and look. Each field can:
- Change its layout (column/row span) based on process type
- Show/hide based on complex business rules
- Have dynamic validation rules
- Adapt its appearance based on view mode
The killer issue? Every time Angular ran change detection (which happens A LOT), we were recalculating these properties over and over. For a form with 20+ fields, this meant hundreds of redundant calculations per render cycle.
My initial time measurement showed that 80% of our render time was spent on these calculations. Not good.
## The Solution I Came Up With: WeakMap Caching
After digging through the code, I realized we needed a caching system that would:
- Cache results based on the specific field and its context
- Not cause memory leaks
- Be simple enough that my team could maintain it
- Work with our existing code without massive refactoring
My solution centered around JavaScript's `WeakMap`, which I'd never used in a real project before. Here's the approach:
## Why WeakMap Is Perfect for This
I chose `WeakMap` specifically because:
- It uses object references as keys (perfect for our field objects)
- It automatically handles garbage collection - when a field object is no longer referenced elsewhere, its cache automatically gets cleaned up
- It's fast - O(1) lookup time
This approach meant zero memory management headaches. We never need to worry about clearing caches manually.
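As a minimal illustration of that point (the `FieldConfig` shape here is hypothetical), a WeakMap cache needs no manual cleanup because entries die with their key objects:

```typescript
// Hypothetical field shape for illustration
interface FieldConfig {
  key: string;
}

// Cache keyed by object identity; an entry becomes collectable
// as soon as the field object itself is garbage collected
const spanCache = new WeakMap<FieldConfig, number>();

function getColSpan(field: FieldConfig): number {
  let span = spanCache.get(field);
  if (span === undefined) {
    span = expensiveSpanCalculation(field); // computed once per field
    spanCache.set(field, span);
  }
  return span;
}

// Stand-in for real layout logic
function expensiveSpanCalculation(field: FieldConfig): number {
  return field.key.length > 10 ? 2 : 1;
}
```

Once the form configuration that owns a `field` is released, its cached span goes with it — no eviction code required.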
## The Real-World Results
After implementing this across our field property systems, the results were dramatic:
- Form rendering time dropped by ~90%
- Scrolling became buttery smooth even on complex forms
- CPU usage during form interaction plummeted
- No more "Angular is running in development mode" warnings about excessive change detection
The best part? We didn't need to change our component code or templates at all. The caching happens entirely in the utility functions, making it a mostly transparent optimization.
## Digging Into the Implementation Details
### The Double-Map Approach
One of the most interesting aspects of this solution is what I call the "double-map" approach. We don't just use a WeakMap - we use a WeakMap of Maps:
```typescript
// This is what's really going on:
WeakMap<FieldConfig, Map<string, ResultType>>;
```
This two-level structure gives us incredible flexibility:
- The outer WeakMap uses the field object as a key
- The inner Map uses a string key representing the specific context (like process types and entity state)
This approach lets us cache multiple different results for the same field based on different contexts. For example, the same field might be visible with OKR process type but hidden with SMART process type.
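A sketch of that two-level lookup — `isFieldVisible` and its signature are illustrative, not the actual API:

```typescript
interface FieldConfig {
  key: string;
}

// Outer WeakMap: field object → inner Map of context-key → result
const visibilityCache = new WeakMap<FieldConfig, Map<string, boolean>>();

function isFieldVisible(
  field: FieldConfig,
  processTypes: string[],
  compute: () => boolean
): boolean {
  let inner = visibilityCache.get(field);
  if (!inner) {
    inner = new Map<string, boolean>();
    visibilityCache.set(field, inner);
  }
  // Sort so ["OKR","SMART"] and ["SMART","OKR"] hit the same entry
  const contextKey = processTypes.slice().sort().join(",");
  if (!inner.has(contextKey)) {
    inner.set(contextKey, compute()); // computed once per field + context
  }
  return inner.get(contextKey)!;
}
```

The same field object can hold one cached answer per context string, which is exactly what the OKR-visible/SMART-hidden case needs.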
### The Cache Key Challenge
The trickiest part was designing proper cache keys. We needed keys that:
- Uniquely identified the calculation context
- Were consistent across calls with the same input values
- Weren't unnecessarily complex (to avoid string parsing overhead)
For entity-based calculations, we used the entity ID as the basis:
```typescript
function createEntityCacheKey(entity, additionalContext) {
  const entityId = entity?.id || "";
  return additionalContext ? `${entityId}_${additionalContext}` : entityId;
}
```
For process-type based calculations, we needed to ensure consistent ordering:
```typescript
function createProcessTypeKey(processTypes) {
  // Sort to ensure consistent key regardless of array order
  return processTypes.slice().sort().join(",");
}
```
This became crucial for complex cases where we combined both:
```typescript
// For visibility that depends on both entity state AND process types
const typesKey = createProcessTypeKey(processTypes);
const cacheKey = createEntityCacheKey(entity, typesKey);
```
### The Validator Function Challenge
One particularly complex case was validation. Angular form validators don't naturally lend themselves to caching, since they're functions rather than plain values.
The challenge was that each field could have multiple different validation rules, and those rules could reference entity values. We solved this by creating a custom validation function factory:
```typescript
function createCachedValidator(rule, entity) {
  return (control) => {
    // For validators, we cache based on the control value as well
    const value = control.value;
    const valueKey =
      typeof value === "object" ? JSON.stringify(value) : String(value);
    const cacheKey = createEntityCacheKey(entity, valueKey);

    // Use our generic caching pattern
    return getCachedValue(rule, validationResultCache, cacheKey, () =>
      rule.validator(value, entity) ? null : { [rule.errorKey]: true }
    );
  };
}
```
This approach gave us validation that was both fast and properly integrated with Angular's form system.
## Beyond Simple Properties: Layout Calculations
The most dramatic improvements came from caching field layout calculations. In our system, fields could specify their layout in several ways:
- Default layout (simple column/row span)
- Process-type specific layouts
- Custom calculation functions based on complex logic
Before caching, template bindings like this were causing major performance issues:
```html
<f-grid-item
  [colSpan]="getFieldColSpan(field, processTypes)"
  [rowSpan]="getFieldRowSpan(field, processTypes)"
></f-grid-item>
```
These calculations could involve multiple conditional checks, lookups, and function calls - all happening twice per field, every render cycle.
Our caching solution consolidated these into a single cached function:
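A sketch of what such a consolidated `getDisplayProperties` could look like — the `FieldConfig` shape, the per-process-type override field, and the fallback values of 2 and 1 are assumptions, not the original code:

```typescript
interface FieldConfig {
  colSpan?: number;
  rowSpan?: number;
  // Hypothetical per-process-type layout overrides
  layoutByProcessType?: Record<string, { colSpan?: number; rowSpan?: number }>;
}

interface DisplayProperties {
  colSpan: number;
  rowSpan: number;
}

const displayCache = new WeakMap<FieldConfig, Map<string, DisplayProperties>>();

function getDisplayProperties(
  field: FieldConfig,
  processTypes: string[]
): DisplayProperties {
  let inner = displayCache.get(field);
  if (!inner) {
    inner = new Map<string, DisplayProperties>();
    displayCache.set(field, inner);
  }
  const key = processTypes.slice().sort().join(",");
  let props = inner.get(key);
  if (!props) {
    // Resolve process-type overrides once, then fall back to defaults
    const override = processTypes
      .map((t) => field.layoutByProcessType?.[t])
      .find((o) => o !== undefined);
    props = {
      colSpan: override?.colSpan ?? field.colSpan ?? 2,
      rowSpan: override?.rowSpan ?? field.rowSpan ?? 1,
    };
    inner.set(key, props);
  }
  return props;
}
```

Both span bindings now resolve from one cached object instead of two independent calculations.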
Then we updated our component with a simpler method:
```typescript
getFieldDisplayValue(field, property) {
  const props = getDisplayProperties(field, this.processTypes);
  return props[property] || (property === 'colSpan' ? 2 : 1);
}
```
And our template became:
```html
<f-grid-item
  [colSpan]="getFieldDisplayValue(field, 'colSpan')"
  [rowSpan]="getFieldDisplayValue(field, 'rowSpan')"
></f-grid-item>
```
This change alone cut our rendering time in half because each field's display properties were now calculated exactly once per render cycle, regardless of how many properties we needed.
## Refactoring Our Service Layer
With the core caching mechanism in place, we refactored our services to use a facade pattern. This kept the API clean while leveraging our optimizations:
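As a sketch, the facade might look like this — the service name, its methods, and the minimal cached utility it delegates to are all assumptions standing in for the real service layer:

```typescript
interface FieldConfig {
  colSpan?: number;
  rowSpan?: number;
}

interface DisplayProperties {
  colSpan: number;
  rowSpan: number;
}

// Stand-in for the cached utility layer described earlier
const cache = new WeakMap<FieldConfig, Map<string, DisplayProperties>>();

function getDisplayProperties(
  field: FieldConfig,
  processTypes: string[]
): DisplayProperties {
  let inner = cache.get(field);
  if (!inner) {
    inner = new Map<string, DisplayProperties>();
    cache.set(field, inner);
  }
  const key = processTypes.slice().sort().join(",");
  let props = inner.get(key);
  if (!props) {
    props = { colSpan: field.colSpan ?? 2, rowSpan: field.rowSpan ?? 1 };
    inner.set(key, props);
  }
  return props;
}

// Facade: components talk to this clean API; the WeakMap caching
// stays an implementation detail of the utility layer
class FieldDisplayService {
  constructor(private readonly processTypes: string[]) {}

  getColSpan(field: FieldConfig): number {
    return getDisplayProperties(field, this.processTypes).colSpan;
  }

  getRowSpan(field: FieldConfig): number {
    return getDisplayProperties(field, this.processTypes).rowSpan;
  }
}
```

Components depend only on the service's surface; the caching can be changed or removed without touching any callers.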
This gave us the best of both worlds: clean service APIs with optimized implementations.
## Unexpected Benefits: Better Error Detection
An unexpected benefit of our caching approach was better error detection. By centralizing these calculations, we were able to add better error handling and debugging.
For example, in our validation function:
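A hedged sketch of that defensive validation — wrapping the rule in a try/catch so a throwing rule surfaces a clear, logged error instead of a cryptic UI failure (the names here are assumptions):

```typescript
interface ValidationRule {
  errorKey: string;
  validator: (value: unknown) => boolean; // may throw on bad input
}

type ValidationErrors = Record<string, boolean> | null;

function runRuleSafely(rule: ValidationRule, value: unknown): ValidationErrors {
  try {
    return rule.validator(value) ? null : { [rule.errorKey]: true };
  } catch (err) {
    // Centralized logging: one place to see which rule blew up and why
    console.error(`Validation rule "${rule.errorKey}" threw:`, err);
    // Fail closed so a broken rule reads as a validation error, not a crash
    return { [rule.errorKey]: true };
  }
}
```

Because all rules funnel through one function, a single log line identifies the misbehaving rule instead of a stack trace buried in change detection.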
This improved error handling made debugging much easier and prevented cryptic UI errors.
## Lessons I Learned
This experience taught me several valuable lessons:
- **Measure before optimizing** - My initial assumptions about what was causing slowness were wrong. Profiling showed the real culprits.
- **WeakMap is underused in Angular** - It's perfect for component-level caching where you need to associate data with objects that have a lifecycle.
- **TypeScript namespaces are still useful** - Despite being considered "old-school" by some, they're great for organizing utility functions.
- **Caching isn't just for HTTP calls** - Calculation caching can have just as big an impact as network caching.
- **Performance optimization doesn't have to be ugly** - This solution actually made our code cleaner and more maintainable.
## The Code Pattern I Use Now
I've since standardized this pattern in our codebase:
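A minimal sketch of that generic pattern, assuming the two-level WeakMap-of-Maps structure described above:

```typescript
// Generic two-level cache: WeakMap keyed by object identity,
// inner Map keyed by a context string
function getCachedValue<K extends object, V>(
  owner: K,
  cache: WeakMap<K, Map<string, V>>,
  cacheKey: string,
  compute: () => V
): V {
  let inner = cache.get(owner);
  if (!inner) {
    inner = new Map<string, V>();
    cache.set(owner, inner);
  }
  if (!inner.has(cacheKey)) {
    inner.set(cacheKey, compute()); // compute only on a miss
  }
  return inner.get(cacheKey)!;
}
```

Any expensive pure function of (object, context) can then be wrapped in one line, e.g. `getCachedValue(field, spanCache, typesKey, () => calcSpan(field))`.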
This has been a game-changer for us. Whenever we have expensive calculations that run repeatedly with the same inputs, we wrap them with this pattern.
## Angular's Change Detection and Our Optimizations
One aspect I haven't discussed yet is how this interacts with Angular's change detection. Angular's default change detection strategy runs checks on all components after any event or async operation. For complex forms, this means a lot of recalculation.
We experimented with OnPush change detection, but that wasn't enough on its own. The problem was that our calculations were still being triggered during every check cycle because template bindings were calling functions.
Our caching solution dramatically reduced the work done during these check cycles. Even though the functions were still being called, they were now returning cached values instead of recalculating.
We did find one extra optimization though - Angular's binding system doesn't cache the results of method calls (like `[colSpan]="getColSpan(field)"`), but it does cache property access. This led us to experiment with a memoization decorator for component methods:
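A minimal sketch of such a decorator, assuming legacy `experimentalDecorators` and JSON-stringified arguments as the cache key (both are choices for this sketch, not necessarily what we shipped):

```typescript
// Legacy (experimentalDecorators) method decorator that memoizes
// per component instance, keyed by the stringified arguments
function Memoize(
  _target: object,
  _propertyKey: string,
  descriptor: PropertyDescriptor
): PropertyDescriptor {
  const original = descriptor.value as (...args: unknown[]) => unknown;
  const caches = new WeakMap<object, Map<string, unknown>>();

  descriptor.value = function (this: object, ...args: unknown[]): unknown {
    let cache = caches.get(this);
    if (!cache) {
      cache = new Map<string, unknown>();
      caches.set(this, cache);
    }
    const key = JSON.stringify(args);
    if (!cache.has(key)) {
      cache.set(key, original.apply(this, args));
    }
    return cache.get(key);
  };
  return descriptor;
}
```

Keying on `JSON.stringify(args)` is what makes stale values possible: if a mutation doesn't change the stringified arguments, the memoized method keeps returning the old result.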
Then we could use it like this:
```typescript
@Memoize
getFieldDisplayValue(field, property) {
  const props = FieldUtils.getDisplayProperties(field, this.processTypes);
  return props[property] || (property === 'colSpan' ? 2 : 1);
}
```
This added an extra layer of caching at the component level, making the binding system more efficient. However, we had to be careful with this approach, as it could lead to stale values if not properly managed.
## Real-World Trade-offs
While the caching approach was generally successful, we did face some trade-offs:
- **Debugging complexity:** Cached values sometimes made debugging more difficult, since the actual calculation might have happened much earlier than when an issue was observed.
- **Cache invalidation concerns:** For some fields, we needed to explicitly clear caches when external state changed in ways our cache keys didn't capture.
- **Memory usage:** While WeakMap helped with garbage collection, our approach did increase memory usage somewhat. The performance benefits far outweighed this cost.
- **TypeScript complexity:** The typing for our caching system was nontrivial, especially with the nested maps and generic types.
Overall, these trade-offs were well worth it for the performance gains we achieved.
## Would I Recommend This Approach?
Absolutely, but with a few caveats:
- Make sure you actually have a performance problem first
- Only cache calculations that are genuinely expensive
- Be careful with your cache keys to avoid incorrect cached values
- Use TypeScript's strong typing to keep the caching system maintainable
- Consider the memory implications, especially for very large forms
- Add proper cache debugging tools if your app is complex