Printing and Scanning

Being part of a 3D printing lab, I have access to a variety of machines for this week. The main machines I tested were the wox3d, the uPrint, the Replicator, and the Objet.

Instead of working out a complex geometry, I decided to do the opposite and focus on the simplest geometry possible, but use it in a way that is not possible (or at least cumbersome) with subtractive methods. The first idea that came to my mind while browsing Thingiverse was to print a chainmail pattern. Some inspirations include

The geometry consists of a single atom element with connections that can be interleaved with other tiles, and then replicated to create much larger structures that have flowing properties. This can technically be done with sewing/knitting, but it is almost trivial with additive manufacturing (and can be done with more interesting materials). The only real constraint was the availability of some support material. In fact, this is the main trick: support alone is not sufficient, we want support that can be easily removed during post-processing.

Modeling the chainmail

I created 4 different base nodes based on a square with two arms, to test the impact of curvature on the resulting design. All of them shared the same printing issue at the scale at which I worked: they all needed support, and the support removal is critical.
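A convenient way to think about the curvature variants is to parameterize each arm as a circular arch defined by its span and rise. The sketch below only illustrates that relationship (the dimensions are made up, not my actual CAD parameters): a small rise gives the low circular arch variant, while a rise of zero degenerates towards the flat, rectangular arm.

```python
import numpy as np

def arch_profile(span, rise, n=51):
    """Sample a circular arch of given span (arm width) and rise (arch height).

    Returns an (n, 2) array of (x, z) points along the arch centerline.
    Assumes rise <= span / 2 (at most a half-circle).
    """
    if rise <= 0:
        # Degenerate case: a flat, rectangular-style arm.
        x = np.linspace(-span / 2, span / 2, n)
        return np.column_stack([x, np.zeros(n)])
    # Radius of the circle through (-span/2, 0), (0, rise) and (span/2, 0).
    r = (rise ** 2 + (span / 2) ** 2) / (2 * rise)
    theta = np.arcsin((span / 2) / r)        # half opening angle of the arc
    t = np.linspace(-theta, theta, n)
    x = r * np.sin(t)
    z = r * np.cos(t) - (r - rise)           # shift so the feet sit at z = 0
    return np.column_stack([x, z])

# Compare a low arch and a higher one for the same arm span (illustrative values).
low = arch_profile(span=10.0, rise=2.0)
high = arch_profile(span=10.0, rise=5.0)
print(low[:, 1].max(), high[:, 1].max())     # apex heights: 2.0 and 5.0
```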

To create the actual chainmail, I used the linear modifier with two perpendicular planes to generate the interleaved components (this requires a 45° rotation from the main axes).
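The replication itself happened in the modeling tool, but the placement logic is simple enough to sketch in code. The following Python snippet is a minimal sketch of that idea (the pitch and tilt values are placeholders, not the actual dimensions I used): the "A" tiles sit on the main grid, the "B" tiles sit on the half-pitch offsets with the opposite tilt, and it is this alternating tilt that lets the arms of neighbouring tiles interleave.

```python
import numpy as np

def chainmail_placements(nx, ny, pitch=12.0, tilt_deg=45.0):
    """Generate (translation, rotation) poses for an interleaved chainmail grid.

    Layer A tiles lie on the main grid tilted by +tilt_deg; layer B tiles lie
    on the half-pitch offsets tilted by -tilt_deg, so neighbouring arms hook
    into each other. All numbers here are placeholders.
    """
    def rot_x(deg):
        a = np.radians(deg)
        c, s = np.cos(a), np.sin(a)
        return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

    poses = []
    for i in range(nx):
        for j in range(ny):
            # Layer A: on the main grid.
            poses.append((np.array([i * pitch, j * pitch, 0.0]), rot_x(+tilt_deg)))
            # Layer B: on the half-pitch offsets, tilted the other way.
            poses.append((np.array([(i + 0.5) * pitch, (j + 0.5) * pitch, 0.0]),
                          rot_x(-tilt_deg)))
    return poses

# A 3x3 chainmail like my first test print: 9 grid tiles + 9 interleaved tiles.
print(len(chainmail_placements(3, 3)), "tiles")
```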

I then tried to print the 3rd design, the one with lower circular arches, on all printers. First, we can note that the interfaces are mostly the same across the different pieces of software, with varying degrees of polish and visual attention. The wox3d one was the most pleasing, and the Objet one the most effective (although very restrictive) for manipulation.

3D printing the chainmail

I initially printed a small chainmail (3x3). The first wox print with PLA failed because the printer was in a weird state. After rebooting, it worked fine and the result looked reasonable, but unfortunately the support could not be separated without breaking the chain elements.

I then printed on the uPrint with support. This printer is much more stable. I was able to pause and resume the print to remove some faulty parts that I had not seen at the beginning, and everything still went fine. It also has a nice way of cleaning its two nozzles automatically. While the print was usable, removing the support took quite a long time. The TAs suggested leaving it in the bath for about 1/3 of the print time, but it took way longer (overnight) for the support to really dissolve.

The Replicator prints all failed because of the lack of support. Knowing how unstable that machine is, I did not expect much, so this was no problem. Fortunately, those prints were much faster (~30 min per attempt).

The Objet prints all went fine thanks to the extra support material and its voxel-based representation. I then printed the multiple examples on that machine. On the resulting plate, they all look the same because they are covered in support material (quite a lot of waste!).

Support removal is unfortunately not automatic and requires using a strong water jet to remove the material manually. This is easy to do but takes quite some time and can be risky for the small components, which break easily.

After manipulating these, I realized that curvature helps a lot with unfolding: my rectangular designs were troublesome and easily locked themselves into bad positions. The best design was the one with low circular arches, which I had chosen to test on all machines. I then decided to print a much larger version of it that would fill the entire Objet platform.

The material behaviour is really interesting, and I could see this being used for design / fashion, although the result is quite heavy with the material I used (RGD450 Rigur).

I also tried printing smaller-scale chainmails (1/2, 1/3, 1/4, 1/5), but most of them broke when I used the water jet to clean off the support. The exception was the 1/2-scale base circular design, which suggests that finer results are possible.

3D scanning

For 3D scanning, I first tried the Sense 3D scanner that was provided to scan my Luffy figurine. Beyond the issues with holes and hard-to-reach regions, the scanning was tricky and frequently needed a restart. However, the quality was pretty good both in terms of color and geometry (captured in high-resolution mode).

I then attempted to do the same with VisualSFM by first capturing a set of pictures with my smartphone and then computing the 3D point cloud from them. The point cloud had similar coverage of the object, but it was much sparser and the underlying geometry much worse. The main problem seems to have been that the zoom was not fixed and could not be controlled, resulting in many different camera intrinsics that did not fit together well. This is much simpler with a DSLR, since we can fix the zoom and get a much more stable estimate of the camera intrinsics.
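A quick sanity check for this (just a sketch, assuming the photos keep their EXIF metadata and sit in a hypothetical luffy_photos folder) is to read the reported focal length and digital zoom ratio of every shot: if these values vary across the set, the intrinsics vary too, and the reconstruction has a harder time agreeing on a single camera model.

```python
from pathlib import Path
from PIL import Image

FOCAL_LENGTH = 0x920A       # EXIF tag: focal length in mm
DIGITAL_ZOOM = 0xA404       # EXIF tag: digital zoom ratio

def camera_settings(photo_dir):
    """Collect per-photo focal length and digital zoom from EXIF metadata."""
    settings = {}
    for path in sorted(Path(photo_dir).glob("*.jpg")):
        exif = Image.open(path).getexif()
        ifd = exif.get_ifd(0x8769)           # the Exif sub-IFD holds these tags
        settings[path.name] = (ifd.get(FOCAL_LENGTH), ifd.get(DIGITAL_ZOOM))
    return settings

# If this prints more than one distinct value, the intrinsics changed between shots.
shots = camera_settings("luffy_photos")      # hypothetical folder of smartphone shots
print(set(shots.values()))
```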

The resulting point cloud after dense reconstruction with CMVS is reasonable, but not quite as good as the one from the Sense. This was to be expected given the engineering and parameter tuning behind the Sense.
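To put a rough number on "sparser", one could compare the two point clouds directly; the sketch below assumes both reconstructions are exported as PLY files (the filenames are hypothetical) and only compares point counts and nearest-neighbour spacing. Note that the SfM cloud has an arbitrary scale, so the spacing values are only meaningful after the two clouds have been aligned and scaled to each other.

```python
import numpy as np
import open3d as o3d
from scipy.spatial import cKDTree

def density_stats(ply_path):
    """Point count and mean nearest-neighbour spacing of a point cloud."""
    pts = np.asarray(o3d.io.read_point_cloud(ply_path).points)
    dists, _ = cKDTree(pts).query(pts, k=2)   # k=2: the first hit is the point itself
    return len(pts), dists[:, 1].mean()

# Hypothetical exports of the two reconstructions.
for name in ("luffy_sense.ply", "luffy_cmvs.ply"):
    count, spacing = density_stats(name)
    print(f"{name}: {count} points, mean spacing {spacing:.4f}")
```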