Document Object Model (DOM) capturing is the method Unblu uses (in embedded co-browsing) to recreate the visitor's browser view for the agent.
While this approach is ideal for serving visitors, since it requires no downloads on the visitor side, there are a number of technical and performance issues to consider.
Performance problems range from a sub-par user experience to, critically, timeouts. Once a timeout is reached, the Unblu server assumes that the browser has stalled, but it cannot determine the reason for the problem.
Two factors determine how severe these problems are:
Power of the underlying hardware.
Complexity of the Document Object Model (DOM).
The Unblu server has a feature that detects whether the page complexity (the number of DOM nodes) exceeds a configurable limit. If the complexity of the DOM tree exceeds this limit, the server aborts processing and displays a 'Page too Complex' message.
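The check itself amounts to counting DOM nodes and comparing the count to a threshold. The following sketch illustrates the idea in Python; it is not Unblu's actual implementation, and the function and variable names are illustrative only.

```python
from html.parser import HTMLParser

# Default threshold, mirroring the Unblu server's 45,000-node default.
MAX_NODES = 45_000

class NodeCounter(HTMLParser):
    """Count element nodes as the HTML is parsed."""
    def __init__(self):
        super().__init__()
        self.nodes = 0

    def handle_starttag(self, tag, attrs):
        self.nodes += 1  # each element contributes one DOM node

def page_too_complex(html: str, limit: int = MAX_NODES) -> bool:
    """Return True if the page's element count exceeds the limit."""
    counter = NodeCounter()
    counter.feed(html)
    return counter.nodes > limit

# A tiny page is well under the limit:
print(page_too_complex("<html><body><p>hi</p></body></html>"))  # False
```

Real DOM complexity also includes text and attribute nodes, so an element count like this understates the true node total; it is the comparison against a fixed limit that matters here.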
Unfortunately, simply raising this limit (using com.unblu.recorder.maxNodes=[configuration value]) is unlikely to solve the problem on its own. The default value is 45,000.
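For illustration, the property would appear in your Unblu configuration like this, shown here at its default value (the exact configuration file depends on your deployment):

```properties
com.unblu.recorder.maxNodes=45000
```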
We do not recommend increasing this default value. Even a page whose node count is below the limit may still prove 'too complex' in practice and generate a warning, so raising the limit merely masks the underlying problem.
|Node-processing capacity was tripled, from 15,000 to 45,000, in the move from Unblu version 4.1 to 4.2. If you are running a version prior to 4.2, simply upgrading your installation may therefore solve potential problems.|
Before attempting to tweak browsers and upgrade hardware you should be aware that, strictly speaking, there is only one guaranteed way to fix this problem:
While more memory, faster processors, and similar upgrades can mitigate this problem, they are not a fix, per se. Even the fastest browser running on a fast machine can hit limits. For example, as more of your pages are set up to run Unblu, the chances increase that some will run up against the DOM complexity limit.
The only way to truly eliminate this limitation is to use web paging when designing your pages. By dividing potential complexity across discrete web pages, you solve DOM complexity problems at the source.
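The principle behind web paging can be sketched as follows: instead of rendering every record on one page, split the data into fixed-size pages so that no single page ever approaches the node limit. This is a hypothetical illustration, not Unblu code; the names are assumptions.

```python
def paginate(items, per_page):
    """Split `items` into consecutive pages of at most `per_page` entries."""
    return [items[i:i + per_page] for i in range(0, len(items), per_page)]

# Stand-in for 100 data rows, each of which would add DOM nodes if rendered.
records = list(range(100))
pages = paginate(records, 25)
print(len(pages))  # 4 pages of 25 rows each
```

Because each page renders only its own slice of the data, the DOM of any single page stays small no matter how large the underlying data set grows.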
If, for whatever reason, you are unable to implement web paging and, given that it is unlikely you could upgrade visitor hardware, you could encourage your visitors to use a modern browser (e.g., Google Chrome). However, bear in mind that this may or may not help, depending on the complexity of the pages on your site.