(Hint: it’s not Comcast)
While 3D scanning technology is becoming more affordable and easier to use, there is still a huge gap between creating a usable 3D point cloud and gathering sufficient data to plan an industrial project, engineer equipment or execute a simulation. Commercial as-built scanning for basic construction or surveying may involve a few dozen scans that need to be properly set up and arranged. Every angle of the scanning area should be accounted for, along with any ‘dead space’ where structures block the scanner’s line of sight.

3D scanning in an industrial setting is far more demanding: compared to common commercial requirements, everything is an order of magnitude (or two) bigger and significantly more complicated. Instead of a few dozen or even a hundred scans, thorough industrial scanning often involves many thousands of scans per project. One main reason is that industrial environments typically contain a large number of obstructions, each creating dead space in the scan that requires another scan to fill in.
Industrial facilities themselves may stretch over many acres and require significant movement to capture. Scanning technicians must therefore be considerably more experienced and skilled, both to ensure that nothing is missed and to take scans in a way that lets the many resulting point clouds be stitched together accurately. A single manufacturing line in a factory can require over 1,000 scans to capture completely. In addition, these scans often need to be taken while the facility is operating; no plant can shut down while your team makes 8,000 3D scans.
Modern 3D scanners store 100-200 megabytes (MB) of raw data per scan. Once the data is registered and indexed, each scan may grow to over 1 gigabyte (GB). Individually, large projects can range from a few TB to 20TB of data, or more! If a client needs a copy of ALL of their project data, it could easily comprise dozens if not hundreds of TB of storage. This poses an interesting problem: how do we move this much data from place to place quickly and efficiently?
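To put those numbers together, here is a minimal back-of-the-envelope sketch in Python. The per-scan figures come straight from the text above; the scan counts (1,000 for a single manufacturing line, 8,000 for a full-facility project) are the illustrative examples used in this article, not a formula:

```python
# Back-of-the-envelope sizing for an industrial scan project.
# Per-scan figures are from the text above (100-200 MB raw, ~1 GB
# once registered and indexed); the scan counts are this article's
# own examples and are illustrative only.

RAW_MB_PER_SCAN = 200          # upper end of raw scanner output
REGISTERED_GB_PER_SCAN = 1.0   # after registration and indexing

def project_size_tb(num_scans):
    """Return (raw_tb, registered_tb) using decimal units (1 TB = 10^12 bytes)."""
    raw_tb = num_scans * RAW_MB_PER_SCAN / 1_000_000             # MB -> TB
    registered_tb = num_scans * REGISTERED_GB_PER_SCAN / 1_000   # GB -> TB
    return raw_tb, registered_tb

for scans in (1_000, 8_000):   # a single manufacturing line vs. a full facility
    raw, reg = project_size_tb(scans)
    print(f"{scans:>5} scans: ~{raw:.1f} TB raw, ~{reg:.1f} TB registered")
# ->  1000 scans: ~0.2 TB raw, ~1.0 TB registered
# ->  8000 scans: ~1.6 TB raw, ~8.0 TB registered
```

An 8,000-scan project lands around 8TB once registered, squarely in the "few TB to 20TB" range described above.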
Despite the ‘broadband revolution’ bringing high-speed video streaming into our homes and devices, transferring files in data-heavy applications is, relatively speaking, no easier than it was a dozen years ago, because datasets are growing faster than bandwidth. Even high-speed business connections can’t keep pace with the hardware and software advances that keep expanding our ability to collect and store data. Even if you could use the entirety of your available internet connection, blocking all other traffic until the job is done, and your recipient could do the same, transferring multiple TB of data is impractical. As the figures below show, common high-speed connections for most small and midsize businesses are insufficient for large datasets.
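Here is a minimal sketch of that math, assuming the best possible case: the link is fully dedicated to the transfer, and protocol overhead, disk speed and retransmissions are ignored. Real-world transfers will be slower. The connection tiers (100Mbps, 1Gbps, 4Gbps) were chosen to match the speeds discussed here:

```python
# Ideal-case transfer time: (terabytes * 8e12 bits) / link speed in bits/s.
# Assumes a fully dedicated link and zero protocol overhead, so these
# numbers are lower bounds on real transfer times.

def transfer_hours(dataset_tb, link_gbps):
    """Hours to move `dataset_tb` decimal terabytes over a `link_gbps` link."""
    bits = dataset_tb * 1e12 * 8
    return bits / (link_gbps * 1e9) / 3600

for tb in (1, 10, 100):
    row = "  ".join(f"{transfer_hours(tb, g):8.1f} h @ {g:g} Gbps"
                    for g in (0.1, 1.0, 4.0))
    print(f"{tb:>4} TB:{row}")
# ->    1 TB:    22.2 h @ 0.1 Gbps     2.2 h @ 1 Gbps     0.6 h @ 4 Gbps
# ->   10 TB:   222.2 h @ 0.1 Gbps    22.2 h @ 1 Gbps     5.6 h @ 4 Gbps
# ->  100 TB:  2222.2 h @ 0.1 Gbps   222.2 h @ 1 Gbps    55.6 h @ 4 Gbps
```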
Having 10TB of data to transfer is not outlandish in today’s data-driven world. Right now it is faster to ‘next-day air’ your data with a shipping company if the repository is 10TB or larger, and if the available connections are closer to 100Mbps, it may be necessary to ship even smaller amounts. Typical small and midsize business internet connections run at less than 1Gbps. For datasets smaller than 10TB, with exceptional high-bandwidth connections at both ends of a transfer, reasonable file exchanges can occur in just a few hours, if, and only if, you can monopolize that connection during the transfer. For datasets larger than 10TB, a connection significantly faster than 1Gbps on both sides is needed to move files in a few hours (a dedicated 4Gbps connection on both ends of the transfer can complete 10TB in about 5.6 hours).
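Plugging the 10TB case into the sketch above shows why overnight shipping wins, even in this idealized, overhead-free scenario:

```python
print(f"{transfer_hours(10, 1.0):.1f} h")   # ~22 h: nearly a full day on a dedicated 1 Gbps link
print(f"{transfer_hours(10, 0.1):.0f} h")   # ~222 h: over nine days at 100 Mbps
```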
As technology advances, datasets will continue to rapidly increase in size and connection bandwidth simply has not been able to keep up. Without a significant advancement in network technology, this situation has no real end in sight, in fact, the problem may accelerate. The growing IIoT and Augmented reality industries are generating more industrial data than ever before. In the next decade of industrial technology, efficient data transfer is quickly becoming a major pain point.