Contributed by Claudia Wessner, Makerspace Coordinator and Library Experience Designer
Huge progress has been made in the collaborative project between The Nest, the makerspace at Phillips Academy, and the Peabody Museum! From the first day we received the NextEngine 3D scanner, we had hopes of testing out this new technology in a fun and interesting way. After talking to Marla Taylor, we both thought it was a no-brainer to form a collaboration between the museum, the makerspace, and a group of work duty students (Alana Gudinas ’16, Jacob Boudreau ’16, Mia LaRocca ’16, and Sarah Schmaier ’16) to further explore the scanning possibilities.
In a previous post, Marla discussed the criteria by which the artifacts were selected. Once we brought the artifacts over to The Nest, we made custom stands for two of the three artifacts so that they would sit stably on the scanner. Then we got to work!
So, how does it work?
Before the scanning starts, we set up preferences such as resolution, color mode, and the number of incremental scans, and position the object in the camera’s field of view. The higher the desired resolution, the longer the scanning process will take. Most of the objects we scanned took around one hour.
The scanner begins by taking a 2D image of the object, then sweeps an array of red laser beams across it to capture the object’s depth and texture. Next, it completes a series of slow incremental rotations, the number depending on the resolution selected, and performs the same capture at each increment. The NextEngine software slowly builds the 3D model before your eyes as it layers the data captured by the scanner.
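The depth capture works by laser triangulation: the camera sees where each laser beam lands on the object, and simple geometry converts that offset into a distance. The scanner’s own math is proprietary, but here is a minimal Python sketch of the idea; the focal length, baseline, and pixel-offset numbers below are made-up illustrations, not NextEngine specs:

```python
def triangulated_depth(focal_length_px, baseline_m, pixel_offset):
    """Estimate depth from a simple laser-triangulation setup.

    focal_length_px: camera focal length, in pixels
    baseline_m: distance between the laser emitter and the camera, in meters
    pixel_offset: how far the laser spot lands from the image center, in pixels
    """
    if pixel_offset <= 0:
        raise ValueError("laser spot not detected")
    # Similar triangles: depth / baseline == focal_length / pixel_offset
    return focal_length_px * baseline_m / pixel_offset

# A spot far from center means the surface is close; near center means far away.
near = triangulated_depth(focal_length_px=1000, baseline_m=0.1, pixel_offset=200)
far = triangulated_depth(focal_length_px=1000, baseline_m=0.1, pixel_offset=20)
```

Repeating this measurement for every point the lasers touch, at every rotation, is what produces the dense cloud of depth data the software stitches into a model.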
After the scanning is complete, the three-dimensional model of the artifact appears in the NextEngine software. Depending on how the artifact is scanned, there may be some holes in the model; these occur where the lasers could not reach, such as the top or bottom of the artifact. Holes can be avoided by completing several scans of the same object (top, bottom, full 360°) and then fusing them together. This is something I am looking forward to experimenting with in the future, but for our initial exploration we did a single scan.
In order to fill the holes in our model, I “polished” it using the NextEngine software. The program automatically finds and selects holes in the model, and you can then use a paintbrush tool to select the areas you’d like to fill. This can also take some time and experimentation, especially with very high-resolution scans, where image rendering can use a lot of computing power.
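The “find holes” step can be understood as a simple mesh operation: in a watertight triangle mesh, every edge is shared by exactly two triangles, so any edge that belongs to only one triangle lies on the rim of a hole. This is not NextEngine’s actual code, just a small Python sketch of that idea:

```python
from collections import Counter

def boundary_edges(triangles):
    """Return edges belonging to only one triangle, i.e. the rims of holes.

    triangles: list of (a, b, c) vertex-index triples.
    """
    counts = Counter()
    for a, b, c in triangles:
        for edge in ((a, b), (b, c), (c, a)):
            counts[tuple(sorted(edge))] += 1  # sort so winding direction doesn't matter
    return [edge for edge, n in counts.items() if n == 1]

# A lone triangle is all boundary; when two triangles share an edge,
# that shared edge is interior and drops out of the result.
lone = boundary_edges([(0, 1, 2)])
pair = boundary_edges([(0, 1, 2), (1, 2, 3)])
```

Once the rims are known, filling a hole amounts to adding new triangles that span them, which is what the paintbrush tool lets you do selectively.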
Once the editing of the model is complete, it is ready to be prepared for printing: save it as an .stl file and open it in the MakerBot Desktop software. There you can scale, rotate, and place your object in the desired location on the build plate, and preview how long the print will take. This artifact, which was approximately four inches tall, took about five hours to print.
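The .stl file itself is a refreshingly simple format. The common binary flavor is an 80-byte header, a 32-bit triangle count, and then 50 bytes per triangle (a normal, three vertices, and a 2-byte attribute). As a sketch of what any slicer has to do under the hood, and not anything MakerBot Desktop actually exposes, here is a Python reader that pulls out the triangles and the model’s bounding box, which is the information used when scaling a model to fit the build plate:

```python
import struct

def read_binary_stl(data):
    """Parse binary STL bytes into a list of triangles (three xyz tuples each)."""
    (count,) = struct.unpack_from('<I', data, 80)  # triangle count follows 80-byte header
    triangles, offset = [], 84
    for _ in range(count):
        values = struct.unpack_from('<12f', data, offset)  # normal + 3 vertices
        triangles.append((values[3:6], values[6:9], values[9:12]))
        offset += 50  # 48 bytes of floats + 2-byte attribute field
    return triangles

def bounding_box(triangles):
    """Return (min_xyz, max_xyz) across all vertices."""
    points = [vertex for tri in triangles for vertex in tri]
    mins = tuple(min(p[i] for p in points) for i in range(3))
    maxs = tuple(max(p[i] for p in points) for i in range(3))
    return mins, maxs

# Build a tiny one-triangle STL in memory to demonstrate.
payload = (b'\x00' * 80 + struct.pack('<I', 1)
           + struct.pack('<12f', 0, 0, 1,  0, 0, 0,  1, 0, 0,  0, 1, 0)
           + b'\x00\x00')
tris = read_binary_stl(payload)
```

A real scan produces hundreds of thousands of these triangles, which is why the files get large at high resolution.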
The MakerBot 3D printer uses a plastic filament called PLA (polylactic acid) that is stored on a spool in the back of the machine. The PLA is melted in a heated extruder, which lays down very thin layers of material to build the object from the bottom up. Think of it like a glorified glue gun! The MakerBot automatically adds “support material” that holds the object up as it prints so that everything stays intact. Once the print is finished, any support material easily breaks off.
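That layer-by-layer build is also why print time scales with an object’s height. The slicer’s real time estimate accounts for travel moves, infill, and speed settings, but a rough back-of-the-envelope layer count, with an assumed (not MakerBot-specified) 0.2 mm layer height, looks like this:

```python
def estimated_layers(height_mm, layer_height_mm=0.2):
    """Rough number of deposited layers for an FDM print (assumed 0.2 mm layers)."""
    return round(height_mm / layer_height_mm)

# A roughly four-inch (about 100 mm) artifact at 0.2 mm layers:
layers = estimated_layers(100)
```

Hundreds of layers, each traced out by the extruder, is what adds up to a multi-hour print for a palm-sized artifact.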
We are so excited about the results of our project! We are looking forward to scanning more artifacts in the Peabody collection and refining our skills with this new technology! Stay tuned!