For this post, I thought I would talk about SPAR 2012, which I attended in Houston last week.
For those of you who are not familiar with it, SPAR is a conference/trade show for the medium and long-range scanning industries. From a geometry perspective, this means dealing with point clouds. Lots of point clouds. Lots of really big point clouds. For example, at one of the talks I attended, the speaker was discussing dealing with thousands of terabytes of data. Another speaker discussed the fact that developing the systems just to manage and archive the huge amount of data being produced is going to be a major challenge, let alone the need for processing it all.
As an example of this, the very first thing I noticed when I walked into the exhibit hall was the huge number of companies selling mobile terrestrial scanners. These are laser scanning units that you strap onto the back of a van, or an SUV, or an ultra-light airplane, or a UAV - there was even a lidar-equipped Mini Cooper on display. You then drive (or fly) along the path you want to scan, acquiring huge amounts of data. The data is then processed to tell you about, e.g., potholes or highway signs or lane lines on roads (the scanners are often coupled with photographic systems) or vegetation incursions on power lines (typically from aerial scans). When I attended two years ago, this was a fairly specialized industry; there were only a few vans on display, the companies that made the hardware tended to be the ones doing the scans, and they also wrote the software to display and interpret the data.
This year, it seemed like this sector had commoditized: there were at least eight vehicles on display, the starting price of scanning units had come down to about $100K, and it seemed that there were vendors everywhere you looked on the display floor (and yes, it does sound odd to me that I’m calling a $100K price point “commoditized”). Another thing that I was looking for, and think I saw, was a bifurcation into hardware and software vendors. I asked several of the hardware vendors about analysis; this year they uniformly told me that they spat their data out into one of several formats that could be read by standard software packages. I view this specialization as a sign of growing maturity in the scanning industry; it shows that the industry is moving past the pioneer days when a hardware manufacturer had to do it all.
On the software side, I saw a LOT of fitting pipes to point clouds. This is because a large part of the medium range scanning market (at least as represented at SPAR) is capturing “as built” models of oil platforms and chemical plants, especially the piping. The workflow is to spend a few days scanning the facility, and then send the data to a contractor who spends a few months building a CAD model of the piping, from which renovation work on the facility can be planned. One of the sub-themes that ran through many of the talks at the conference was “be careful of your data – even though the scanner says it’s accurate to 1mm, you’re probably only getting ½ inch of accuracy”. This was driven home to us at Spatial a few years ago when we bought a low-end scanner to play around with and discovered that a sharp black/white transition caused a “tear” in the surface mesh spit out from the scanner (due to differential systematic errors between white and black). A practical example of this was discussed in a talk by one of the service providers; he gave a case study of a company that tried to refit a plant using the workflow described above. Early on they discovered that the purported “as built” model (obtained by humans building models from scanned data) wasn’t accurate enough to do the work – new piping that should have fit correctly according to the model wouldn’t fit in reality (for example, all the small-diameter pipes had been left out of the model completely). This is because a real-world plant isn’t a bunch of clean vertical and horizontal cylinders; pipes sag, they’re stressed (bent) to make them fit pieces of equipment, and so on. The company went back and had the job re-done, this time keeping a close tie between the original scans and the model at all stages.
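To give a flavor of what “fitting pipes to point clouds” involves at its simplest, here is a minimal sketch (my own illustration, not any vendor’s algorithm) that recovers a straight pipe’s axis and radius from a synthetic point cloud using PCA. Real plant data is far messier – sagging, bends, occlusion – which is exactly why the human-in-the-loop modeling described above takes months.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "pipe": radius 0.5, axis along z, length 10, plus
# simulated scanner noise.
t = rng.uniform(0, 2 * np.pi, 2000)
z = rng.uniform(0, 10, 2000)
pts = np.column_stack([0.5 * np.cos(t), 0.5 * np.sin(t), z])
pts += rng.normal(scale=0.01, size=pts.shape)

# For a long cylinder, the dominant principal direction of the
# points approximates the axis.
centered = pts - pts.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
axis = vt[0]

# Radius estimate: mean distance of the points from the fitted axis.
radial = centered - np.outer(centered @ axis, axis)
radius = np.linalg.norm(radial, axis=1).mean()

print(abs(axis[2]))  # close to 1.0: axis recovered along z
print(radius)        # close to 0.5
```

This works only because the synthetic pipe is long, straight, and fully sampled; production tools use far more robust techniques (RANSAC, region growing, normal-based cylinder fits) to cope with partial scans and clutter.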
I really appreciated the attention to detail shown in this and other talks; in my opinion it’s just good engineering to understand and control for the systematic errors that are introduced at every stage of the process.
Two more quick observations:
- Several people mentioned the Microsoft Kinect sensor (for the gaming console) as a disruptive technology. My gut is telling me that there’s a good chance that this will truly commoditize the scanning world, and that photogrammetry might take over from laser scanning.
- I didn’t expect my former life as a particle physicist to be relevant at a scanning conference. Imagine my surprise when I saw not one but TWO pictures of particle accelerators show up in talks (and one of them in a plenary session!)
Next year’s SPAR conference is in Colorado Springs – I hope to see you there!