{"id":1831,"date":"2018-02-13T14:23:11","date_gmt":"2018-02-13T22:23:11","guid":{"rendered":"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/?page_id=1831"},"modified":"2018-12-05T15:46:26","modified_gmt":"2018-12-05T23:46:26","slug":"tutorials","status":"publish","type":"page","link":"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/tutorials\/","title":{"rendered":"Software Tutorials"},"content":{"rendered":"<p>[et_pb_section bb_built=&#8221;1&#8243; admin_label=&#8221;section&#8221; _builder_version=&#8221;3.0.47&#8243;][et_pb_row admin_label=&#8221;row&#8221; _builder_version=&#8221;3.0.47&#8243; background_size=&#8221;initial&#8221; background_position=&#8221;top_left&#8221; background_repeat=&#8221;repeat&#8221;][et_pb_column type=&#8221;4_4&#8243;][et_pb_text _builder_version=&#8221;3.17.3&#8243; background_size=&#8221;initial&#8221; background_position=&#8221;top_left&#8221; background_repeat=&#8221;repeat&#8221; background_layout=&#8221;light&#8221;]<\/p>\n<p>\nThis page is under construction. 
Check back soon to see new tutorials.<\/p>\n<p>[\/et_pb_text][\/et_pb_column][\/et_pb_row][et_pb_row _builder_version=&#8221;3.0.105&#8243;][et_pb_column type=&#8221;4_4&#8243;][et_pb_tabs _builder_version=&#8221;3.17.3&#8243;][et_pb_tab _builder_version=&#8221;3.17.3&#8243; title=&#8221;Bibliographic and Research Management Software&#8221; use_background_color_gradient=&#8221;off&#8221; background_color_gradient_start=&#8221;#2b87da&#8221; background_color_gradient_end=&#8221;#29c4a9&#8243; background_color_gradient_type=&#8221;linear&#8221; background_color_gradient_direction=&#8221;180deg&#8221; background_color_gradient_direction_radial=&#8221;center&#8221; background_color_gradient_start_position=&#8221;0%&#8221; background_color_gradient_end_position=&#8221;100%&#8221; background_color_gradient_overlays_image=&#8221;off&#8221; parallax=&#8221;off&#8221; parallax_method=&#8221;on&#8221; background_size=&#8221;cover&#8221; background_position=&#8221;center&#8221; background_repeat=&#8221;no-repeat&#8221; background_blend=&#8221;normal&#8221; allow_player_pause=&#8221;off&#8221; background_video_pause_outside_viewport=&#8221;on&#8221; tab_text_shadow_style=&#8221;none&#8221; body_text_shadow_style=&#8221;none&#8221; tab_text_shadow_horizontal_length=&#8221;0em&#8221; tab_text_shadow_vertical_length=&#8221;0em&#8221; tab_text_shadow_blur_strength=&#8221;0em&#8221; body_text_shadow_horizontal_length=&#8221;0em&#8221; body_text_shadow_vertical_length=&#8221;0em&#8221; body_text_shadow_blur_strength=&#8221;0em&#8221;]<\/p>\n<div><a href=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/06\/zotero-logo.1519312231.jpg\"><img loading=\"lazy\" decoding=\"async\" class=\"alignleft wp-image-2501\" src=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/06\/zotero-logo.1519312231.jpg\" alt=\"\" width=\"100\" height=\"30\"><\/a><\/div>\n<p><a href=\"https:\/\/www.zotero.org\/\">Zotero<\/a> 
(zoh-TAIR-oh) is a free, open-source, well-supported tool that is meant to be &#8220;your personal research assistant.&#8221; Originally, it was created as a bibliographic manager, but it has become much more powerful over the years and allows you to &#8220;collect, organize, cite, and share your research&#8221; in useful ways. Moreover, Zotero &#8220;is developed by an independent, nonprofit organization &#8230;. With Zotero, you always stay in control of your own data.&#8221;<\/p>\n<p>If you&#8217;re looking to streamline your scholarly research workflow; collect sources directly from the web; organize your annotations, PDFs, and notes in one place; create and manage citation styles and bibliographies with one click; and be assured that your data will be exportable; then Zotero is the tool for you.<\/p>\n<p>This tutorial will walk you through the process of installing Zotero, its browser connectors, and its most useful plugin for research management, &#8220;ZotFile.&#8221; The combination of these three tools will give you the ability to easily collect, organize, cite, and share your research.<\/p>\n<p>[\/et_pb_tab][et_pb_tab title=&#8221;Textual Analysis&#8221; _builder_version=&#8221;3.17.3&#8243; use_background_color_gradient=&#8221;off&#8221; background_color_gradient_start=&#8221;#2b87da&#8221; background_color_gradient_end=&#8221;#29c4a9&#8243; background_color_gradient_type=&#8221;linear&#8221; background_color_gradient_direction=&#8221;180deg&#8221; background_color_gradient_direction_radial=&#8221;center&#8221; background_color_gradient_start_position=&#8221;0%&#8221; background_color_gradient_end_position=&#8221;100%&#8221; background_color_gradient_overlays_image=&#8221;off&#8221; parallax=&#8221;off&#8221; parallax_method=&#8221;on&#8221; background_size=&#8221;cover&#8221; background_position=&#8221;center&#8221; background_repeat=&#8221;no-repeat&#8221; background_blend=&#8221;normal&#8221; allow_player_pause=&#8221;off&#8221; 
background_video_pause_outside_viewport=&#8221;on&#8221; tab_text_shadow_style=&#8221;none&#8221; body_text_shadow_style=&#8221;none&#8221;]<\/p>\n<p>Under construction. Check back soon!<\/p>\n<p>[\/et_pb_tab][et_pb_tab title=&#8221;Geospatial &amp; GIS&#8221; _builder_version=&#8221;3.17.3&#8243; use_background_color_gradient=&#8221;off&#8221; background_color_gradient_start=&#8221;#2b87da&#8221; background_color_gradient_end=&#8221;#29c4a9&#8243; background_color_gradient_type=&#8221;linear&#8221; background_color_gradient_direction=&#8221;180deg&#8221; background_color_gradient_direction_radial=&#8221;center&#8221; background_color_gradient_start_position=&#8221;0%&#8221; background_color_gradient_end_position=&#8221;100%&#8221; background_color_gradient_overlays_image=&#8221;off&#8221; parallax=&#8221;off&#8221; parallax_method=&#8221;on&#8221; background_size=&#8221;cover&#8221; background_position=&#8221;center&#8221; background_repeat=&#8221;no-repeat&#8221; background_blend=&#8221;normal&#8221; allow_player_pause=&#8221;off&#8221; background_video_pause_outside_viewport=&#8221;on&#8221; tab_text_shadow_style=&#8221;none&#8221; body_text_shadow_style=&#8221;none&#8221;]<\/p>\n<p>Under construction. 
Check back soon!<\/p>\n<p>[\/et_pb_tab][et_pb_tab _builder_version=&#8221;3.17.3&#8243; title=&#8221;Reading Technical Documentation&#8221; use_background_color_gradient=&#8221;off&#8221; background_color_gradient_start=&#8221;#2b87da&#8221; background_color_gradient_end=&#8221;#29c4a9&#8243; background_color_gradient_type=&#8221;linear&#8221; background_color_gradient_direction=&#8221;180deg&#8221; background_color_gradient_direction_radial=&#8221;center&#8221; background_color_gradient_start_position=&#8221;0%&#8221; background_color_gradient_end_position=&#8221;100%&#8221; background_color_gradient_overlays_image=&#8221;off&#8221; parallax=&#8221;off&#8221; parallax_method=&#8221;on&#8221; background_size=&#8221;cover&#8221; background_position=&#8221;center&#8221; background_repeat=&#8221;no-repeat&#8221; background_blend=&#8221;normal&#8221; allow_player_pause=&#8221;off&#8221; background_video_pause_outside_viewport=&#8221;on&#8221; tab_text_shadow_style=&#8221;none&#8221; body_text_shadow_style=&#8221;none&#8221; tab_text_shadow_horizontal_length=&#8221;0em&#8221; tab_text_shadow_vertical_length=&#8221;0em&#8221; tab_text_shadow_blur_strength=&#8221;0em&#8221; body_text_shadow_horizontal_length=&#8221;0em&#8221; body_text_shadow_vertical_length=&#8221;0em&#8221; body_text_shadow_blur_strength=&#8221;0em&#8221;]<\/p>\n<p><a name=\"top-of-page\"><\/a><\/p>\n<p><strong>Available as <a href=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/07\/ReadingTechnicalDocumentation.pdf\">a PDF<\/a> | or <\/strong><a href=\"https:\/\/github.com\/eltiffster\/readingDocs\"><strong>on Github<\/strong><\/a><\/p>\n<p>Technical documentation is any material meant to accompany a particular tool, software, or technology which can be descriptive (what problem does this solve?) or instructional (how-to). 
Although documentation (&#8220;docs&#8221;) comes in various formats, this guide focuses on software documentation in textual format published on the web.<\/p>\n<p>Here&#8217;s the most common scenario in which I read documentation: I&#8217;m just starting a project and have a specific goal in mind, but I don&#8217;t know how to go about doing it. I may not even know what programming language, device, etc. to use. Can I do this with something I already have, or should I install a dedicated application for it? If I want to make a <a href=\"https:\/\/www.youtube.com\/watch?v=x5hGF7NsG7Q\">sound-reactive light display<\/a>, should I use <a href=\"https:\/\/www.arduino.cc\/\">Arduino<\/a> or <a href=\"https:\/\/www.raspberrypi.org\/\">Raspberry Pi<\/a>? What physical parts (LEDs, microphone, etc.) do I need? Which seems less complicated and time-consuming?<\/p>\n<p>In short, I&#8217;m reading docs because I want to know what steps are involved and what the final product might look like\u2014without investing too much time and resources upfront or reinventing the wheel. However, reading documentation is like trying to find a word in a dictionary without knowing how to spell it: how do I find what I&#8217;m looking for when I don&#8217;t know what to look for?<\/p>\n<p>This guide walks through types of documentation, how to parse them, and common conventions in web documentation (e.g. README files on GitHub). It provides advice and tips to help you navigate docs and sift through the dizzying amount of information on the internet. 
Although much of this advice holds true for many types of software, I&#8217;ll be using examples from <a href=\"https:\/\/www.python.org\/\">Python<\/a>, <a href=\"https:\/\/github.com\/jcjohnson\/torch-rnn\">Machine Learning<\/a>, <a href=\"https:\/\/github.com\/szweibel\/DHSI-API-workshop\/blob\/master\/command-line\/sections\/what-is-the-command-line.md\">Command Line<\/a>, <a href=\"https:\/\/www.arduino.cc\/\">Arduino<\/a>, <a href=\"https:\/\/www.w3schools.com\/js\/js_intro.asp\">JavaScript<\/a>, and <a href=\"http:\/\/twinery.org\/\">Twine<\/a>.<\/p>\n<p>&nbsp;<\/p>\n<table style=\"width: 40%\">\n<tbody>\n<tr>\n<td>\n<h1>Overview<\/h1>\n<h3>Types of Documentation<\/h3>\n<ul>\n<li><a href=\"#tutorials\">Tutorials<\/a><\/li>\n<li><a href=\"#topical-guides\">Topical Guides<\/a><\/li>\n<li><a href=\"#reference-guides\">Reference Guides<\/a><\/li>\n<li><a href=\"#cookbooks\">Cookbooks<\/a><\/li>\n<li><a href=\"#help-forums\">Help Forums<\/a><\/li>\n<li><a href=\"#readme-files\">README Files<\/a><\/li>\n<\/ul>\n<h3>Reading and Using Documentation<\/h3>\n<ol>\n<li><a href=\"#1-narrowing-your-search-terms\">Narrowing Your Search Terms<\/a><\/li>\n<li><a href=\"#2-skim-it\">Skim It<\/a><\/li>\n<li><a href=\"#3-code-and-the-command-line\">Code and the Command Line<\/a><\/li>\n<li><a href=\"#4-copypaste\">Copy\/Paste<\/a><\/li>\n<li><a href=\"#5-useful-options-commands-or-keyboard-shortcuts\">Useful Options, Commands, or Keyboard Shortcuts<\/a><\/li>\n<li><a href=\"#6-if-you-think-there-must-be-an-easier-way-to-do-this-then-there-probably-is\">&#8220;There must be an easier way to do this&#8230;&#8221;<\/a><\/li>\n<li><a href=\"#7-print-early-print-often\">Print Early, Print Often<\/a><\/li>\n<li><a href=\"#8-start-small-and-scale-up\">Start Small and Scale Up<\/a><\/li>\n<li><a href=\"#9-understanding-error-messages\">Understanding Error Messages<\/a><\/li>\n<li><a href=\"#10-read-more-documentation\">Read More 
Documentation<\/a><\/li>\n<\/ol>\n<\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<p>&nbsp;<\/p>\n<h1 id=\"typesofdocumentation\">Types of Documentation<\/h1>\n<figure id=\"attachment_2648\" aria-describedby=\"caption-attachment-2648\" style=\"width: 1000px\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/07\/arduinoDocs.png\"><img loading=\"lazy\" decoding=\"async\" class=\"wp-image-2648 size-large\" src=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/07\/arduinoDocs-1000x629.png\" alt=\"\" width=\"1000\" height=\"629\" srcset=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/07\/arduinoDocs-1000x629.png 1000w, https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/07\/arduinoDocs-620x390.png 620w, https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/07\/arduinoDocs-768x483.png 768w, https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/07\/arduinoDocs-1080x679.png 1080w, https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/07\/arduinoDocs.png 1436w\" sizes=\"auto, (max-width: 1000px) 100vw, 1000px\" \/><\/a><figcaption id=\"caption-attachment-2648\" class=\"wp-caption-text\">The documentation page for Arduino, a physical computing technology.<\/figcaption><\/figure>\n<p>Documentation can look very VERY different from source to source, and not all documentation is well-written (arguably, most is not). Docs vary in organization and level of detail, among many other ways. What follows is a brief, non-exhaustive typology of stuff on the web:<\/p>\n<h3 id=\"tutorials\">Tutorials<\/h3>\n<p>Usually, step-by-step instructions to accomplish a specific task. Even if it&#8217;s not exactly what you&#8217;re looking for, similar projects might still have useful code you can modify. 
<a href=\"https:\/\/programminghistorian.org\/en\/lessons\/\">See examples<\/a> from Programming Historian.<\/p>\n<h3 id=\"topicalguides\">Topical Guides<\/h3>\n<p>Information about a specific subject or feature. <a href=\"https:\/\/internetarchive.readthedocs.io\/en\/latest\/items.html\">See example<\/a> about Items in the Internet Archive&#8217;s library and API.<\/p>\n<h3 id=\"referenceguides\">Reference Guides<\/h3>\n<p>This type of doc most resembles a literal dictionary. It typically provides a bare-bones description of a specific function or command. Especially useful if you vaguely know what you need but forget the syntax (what you actually type in). <a href=\"https:\/\/www.arduino.cc\/reference\/en\/\">See example<\/a>: Arduino&#8217;s reference guide for functions.<\/p>\n<h3 id=\"cookbooks\">Cookbooks<\/h3>\n<p>A collection of code examples or recipes for a specific piece of software. I&#8217;ve never found this organically online (examples are often rolled into other types of docs), but it seems pretty common in e-book or book format. You can find a lot of them through UVic Libraries and probably other library search portals too.<\/p>\n<h3 id=\"helpforums\">Help Forums<\/h3>\n<p><a href=\"https:\/\/stackoverflow.com\/\">Stack Overflow<\/a> is the most (in)famous and often most helpful, although you can also find answers in GitHub discussions or sometimes even Reddit.<\/p>\n<h3 id=\"readmefiles\">README Files<\/h3>\n<p>This is what you&#8217;ll find on GitHub. <a href=\"https:\/\/github.com\/jcjohnson\/torch-rnn\">See example<\/a>: torch-rnn docs by Justin Johnson. Most README files are split into sections:<\/p>\n<ul>\n<li><em>Installation:<\/em> How to install something step-by-step.<\/li>\n<li><em>Dependencies or Requirements:<\/em> Other things that need to be pre-installed for it to work (e.g. code libraries or other software).<\/li>\n<li><em>Support:<\/em> Whether the software requires a specific operating system (e.g. Windows) or version (e.g. Python 3 vs. 
Python 2).<\/li>\n<li><em>Getting Started, Quickstart, etc.:<\/em> Probably the most useful part of the doc. Usually contains simple examples to demonstrate a software&#8217;s capabilities.<\/li>\n<li><em>Examples:<\/em> These may be dispersed across sections or silently packaged in one folder named &#8220;examples&#8221; or &#8220;samples&#8221; or something similar.<\/li>\n<li><em>Features:<\/em> Things the software can do.<\/li>\n<li><em>License:<\/em> Any conditions or rules for distribution of the software.<\/li>\n<li><em>Updates or Release History:<\/em> Notes about new releases, fixes, and the like. If a piece of software seems outdated, it may be deprecated and non-functional.<\/li>\n<\/ul>\n<p><strong>Note:<\/strong> Most documentation incorporates aspects of more than one type. In fact, it&#8217;s rare to find a doc that doesn&#8217;t.<\/p>\n<p>&nbsp;<\/p>\n<h1 id=\"readingandusingdocumentation\">Reading and Using Documentation<\/h1>\n<figure id=\"attachment_2649\" aria-describedby=\"caption-attachment-2649\" style=\"width: 1000px\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/07\/optoCode.png\"><img loading=\"lazy\" decoding=\"async\" class=\"wp-image-2649 size-large\" src=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/07\/optoCode-1000x466.png\" alt=\"\" width=\"1000\" height=\"466\" srcset=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/07\/optoCode-1000x466.png 1000w, https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/07\/optoCode-620x289.png 620w, https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/07\/optoCode-768x358.png 768w, https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/07\/optoCode-1080x503.png 1080w, 
https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/07\/optoCode.png 1150w\" sizes=\"auto, (max-width: 1000px) 100vw, 1000px\" \/><\/a><figcaption id=\"caption-attachment-2649\" class=\"wp-caption-text\">Python code written in the Sublime Text Editor.<\/figcaption><\/figure>\n<h3 id=\"1narrowingyoursearchterms\">1. Narrowing Your Search Terms<\/h3>\n<p>This is half the battle, since knowing an accurate search term will yield better results. For example, say I want to make a light display with Arduino that flashes different colours. Googling &#8220;arduino light display sound&#8221; might give me a wide range of results, some of which are unrelated to my project. But based on that, I might find that &#8220;sound-reactive LED arduino&#8221; or &#8220;arduino music visualizer&#8221; is closer to what I&#8217;m looking for. Additionally, consider if you can refine your search using <a href=\"https:\/\/www.uvic.ca\/library\/research\/tips\/searchsmart\/index.php\">boolean operators or other methods<\/a>.<\/p>\n<h3 id=\"2skimit\">2. Skim It<\/h3>\n<p>No one reads documentation for the plot. Like other kinds of research, you&#8217;ll likely skim docs and slow down if something catches your eye. It&#8217;s often helpful to look at an overview or table of contents section first. You might even stumble across a solution to another problem in the process.<\/p>\n<h3 id=\"3codeandthecommandline\">3. 
Code and the Command Line<\/h3>\n<p>In case you haven&#8217;t guessed already:<code><br \/>\n<\/code><\/p>\n<p><a href=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/07\/block1.png\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone wp-image-2652 size-full\" src=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/07\/block1.png\" alt=\"\" width=\"902\" height=\"73\" srcset=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/07\/block1.png 902w, https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/07\/block1-620x50.png 620w, https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/07\/block1-768x62.png 768w\" sizes=\"auto, (max-width: 902px) 100vw, 902px\" \/><\/a><\/p>\n<p>Code can also be <code>formatted inline<\/code>. Often, the code blocks contain commands for the Linux\/Bash Command Line. You can find this in Terminal on a Mac or <a href=\"https:\/\/www.windowscentral.com\/how-install-bash-shell-command-line-windows-10\">Bash on Ubuntu on Windows<\/a> in Windows 10. For older Windows operating systems, you can try <a href=\"https:\/\/gitforwindows.org\/\">Git BASH<\/a>. (For more on the Command Line and what it&#8217;s for, see this <a href=\"https:\/\/github.com\/szweibel\/DHSI-API-workshop\/blob\/master\/command-line\/sections\/what-is-the-command-line.md\">excellent explanation<\/a> by Jojo Karlin, Jonathan Reeve, Patrick Smyth, and Steven Zweibel.) You can also play around with the Command Line in <a href=\"http:\/\/dhbox.org\/\">DH Box<\/a> without needing to set it up or risk accidentally deleting something.<\/p>\n<p>How can you tell if some code is meant for the Command Line? 
You might see things like this:<code class=\"bash language-bash\"><\/code><\/p>\n<p><a href=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/07\/block2.png\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone wp-image-2653 size-full\" src=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/07\/block2.png\" alt=\"\" width=\"897\" height=\"250\" srcset=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/07\/block2.png 897w, https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/07\/block2-620x173.png 620w, https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/07\/block2-768x214.png 768w\" sizes=\"auto, (max-width: 897px) 100vw, 897px\" \/><\/a><\/p>\n<p>With most code blocks, the documentation writer likely won&#8217;t specify what language it&#8217;s in. Readers are expected to know from the syntax of the commands and this can be confusing for beginners.<\/p>\n<p>Let&#8217;s look at an IRL example from the <a href=\"https:\/\/github.com\/jcjohnson\/torch-rnn\">torch-rnn documentation<\/a> written by Justin Johnson:<code class=\"bash language-bash\"><\/code><\/p>\n<p><a href=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/07\/block3.png\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-2654\" src=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/07\/block3.png\" alt=\"\" width=\"896\" height=\"225\" srcset=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/07\/block3.png 896w, https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/07\/block3-620x156.png 620w, https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/07\/block3-768x193.png 768w\" sizes=\"auto, 
(max-width: 896px) 100vw, 896px\" \/><\/a><\/p>\n<p>This is a series of commands you would type into the Command Line (having installed Lua). Note that you have to press Enter to run something in the Command Line, so make sure you do that after every line. The words <code>#after the hash<\/code> are comments (<a href=\"#5-useful-options-commands-or-keyboard-shortcuts\">we&#8217;ll talk about them later<\/a>), so you don&#8217;t need to type them into the Command Line console.<\/p>\n<h3 id=\"4copypaste\">4. Copy\/Paste<\/h3>\n<p>When I talk to people in my home department (English Literature), some are reluctant to copy\/paste and use code they find on the internet in their own programs. In the Humanities, we&#8217;re taught that using other people&#8217;s work without giving credit, formatted according to specific conventions, is plagiarism and should be avoided.<\/p>\n<p>For better or worse, this is generally not the way coders operate. If anything, copy\/pasting code is encouraged. Not only does it save you time and effort (and possibly a lot of frustration), but you can be reasonably confident that it will work and that you haven&#8217;t introduced any errors by mistyping something. Of course, this should be balanced with opportunities to learn by writing your own code (even manually typing out something you&#8217;ve found can be a good learning experience).<\/p>\n<p>However, chances are that you&#8217;ll write a bit of original code anyway, since you&#8217;ll probably have to modify anything you find to suit the situation at hand. At the very least, you should change variable names (<a href=\"https:\/\/github.com\/szweibel\/DHSI-API-workshop\/blob\/master\/python\/sections\/variables.md\">what&#8217;s a variable?<\/a>) to something more descriptive for your project.<\/p>\n<h3 id=\"5usefuloptionscommandsorkeyboardshortcuts\">5. 
Useful Options, Commands, or Keyboard Shortcuts<\/h3>\n<p>Here are some tips for working with code or text editors such as <a href=\"https:\/\/notepad-plus-plus.org\/\">Notepad++<\/a>, <a href=\"https:\/\/www.sublimetext.com\/\">Sublime<\/a>, or <a href=\"https:\/\/atom.io\/\">Atom<\/a>.<\/p>\n<p><strong>Note:<\/strong> The exact keys vary depending on the editor and operating system (the shortcuts below are meant for a Windows PC). If you&#8217;re using a Mac, replace Ctrl with the Command key.<\/p>\n<h4 id=\"awordwrap\"><strong>a. Word Wrap<\/strong><\/h4>\n<p>This is usually under the View dropdown menu in your text editor. Basically, it shifts words to a new line when the text would otherwise continue outside the window\u2014it&#8217;s similar to how Microsoft Word automatically jumps to a new line once you&#8217;ve reached the maximum width of your document. Even if you have word wrap enabled, the code will still execute as if it were all written on the same line (this is very important in coding!).<\/p>\n<h4 id=\"bfindandreplacecntrlh\"><strong>b. Find and Replace: Ctrl + h<\/strong><\/h4>\n<p>This is especially helpful for renaming variables, which can happen if you discover that a name conflicts with something else in your code, or for other reasons. Many text editors have extra options such as matching case.<\/p>\n<h4 id=\"cindentationcntrlincreaseindentorcntrldecreaseindent\"><strong>c. Indentation: Ctrl + ] (increase indent) or Ctrl + [ (decrease indent)<\/strong><\/h4>\n<p>Indentation is syntactically meaningful in some coding languages (such as Python) and a strong convention in the rest. For example, code inside <a href=\"http:\/\/www.openbookproject.net\/books\/bpp4awd\/ch04.html\">conditionals<\/a> or <a href=\"https:\/\/www.tutorialspoint.com\/computer_programming\/computer_programming_loops.htm\">loops<\/a> often needs to be indented. As a visual cue, it&#8217;s also more human-readable that way!<\/p>\n<h4 id=\"dcommentingoutanduncommentingcntrl\"><strong>d. 
Commenting Out and Uncommenting: Ctrl + \/<\/strong><\/h4>\n<p>To &#8220;comment out&#8221; something means to turn a block of code into a comment, thereby making it inert or dormant (the computer won&#8217;t run it). More specifically, changing that code into a comment signals the computer to skip those lines when running the program. &#8220;Uncommenting&#8221; is the reverse: your comment becomes executable code.<\/p>\n<p>Some examples of comment format\/syntax:<\/p>\n<p><a href=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/07\/block4.png\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-2655\" src=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/07\/block4.png\" alt=\"\" width=\"895\" height=\"169\" srcset=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/07\/block4.png 895w, https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/07\/block4-620x117.png 620w, https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/07\/block4-768x145.png 768w\" sizes=\"auto, (max-width: 895px) 100vw, 895px\" \/><\/a><\/p>\n<p>This is a handy and non-destructive way of &#8220;erasing&#8221; code without actually deleting it. Sometimes, once I&#8217;ve progressed to a certain stage, I create a duplicate or backup version of some code and comment it out. That way, if I play around with the code further and get stuck, I can always return to a clean copy that I know still works.<\/p>\n<p>You can also use this technique to alternate between two options by commenting out the option you don&#8217;t need and uncommenting it later when you need it. 
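As a minimal, hypothetical sketch of this toggle pattern (the file names and the processing step are invented for illustration, not taken from any script shown here):

```python
# Hypothetical sketch of comment-toggling between two options.
import glob

# Option A: a hand-picked list of files (currently active).
files = ["chapter1.txt", "chapter2.txt"]

# Option B: every .txt file in a directory (commented out for now;
# select these lines and press Ctrl + / to swap which option runs).
# files = sorted(glob.glob("texts/*.txt"))

for name in files:
    print(name)  # stand-in for whatever per-file processing you need
```

Swapping which option is commented out changes the script's behaviour without deleting either version.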
For example, say I want a <a href=\"https:\/\/github.com\/eltiffster\/authorFunction\/blob\/master\/code\/cleanup.py\">Python script<\/a> that I can apply either to a list of specific files or to every file in a specific directory. I could switch between the two like this:<\/p>\n<p><a href=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/07\/demo2.gif\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-2645\" src=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/07\/demo2.gif\" alt=\"\" width=\"600\" height=\"403\"><\/a><\/p>\n<p>Additionally, you can print variables or values (see below) to the screen to help with debugging and comment them out after.<\/p>\n<h3 id=\"6ifyouthinktheremustbeaneasierwaytodothisthenthereprobablyis\">6. If you think, &#8220;There must be an easier way to do this,&#8221; then there probably is<\/h3>\n<p>As I mentioned at the beginning of this guide, avoid reinventing the wheel wherever possible. In practical terms, this might mean searching for <a href=\"https:\/\/www.techopedia.com\/definition\/3828\/software-library\">a code library<\/a> that does what you&#8217;re looking for, rather than assuming you need to write something completely from scratch. If one library doesn&#8217;t have a built-in option for your specific use case, a similar one might. Similarly, if a would-be solution is hard to get working, don&#8217;t feel the need to make it work with brute force. There may be a different, less troublesome solution elsewhere.<\/p>\n<p>When reading documentation, look for examples, screenshots, or videos that show you exactly what the end result of some code or process looks like. 
This will not only help with debugging by comparing what you expect to what you get; it&#8217;ll also help you decide if what you get is close enough to what you&#8217;re looking for\u2014or if you want to look elsewhere.<\/p>\n<h3 id=\"7printearlyprintoften\">7. Print Early, Print Often<\/h3>\n<p>Printing things to the screen or console is usually one of the first commands you learn in programming. Examples:<code><br \/>\n<\/code><\/p>\n<p><a href=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/07\/block5.png\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-2656\" src=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/07\/block5.png\" alt=\"\" width=\"898\" height=\"267\" srcset=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/07\/block5.png 898w, https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/07\/block5-620x184.png 620w, https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/07\/block5-768x228.png 768w\" sizes=\"auto, (max-width: 898px) 100vw, 898px\" \/><\/a><\/p>\n<p>As mentioned above, you can print variables or values. This is helpful for checking whether the output of a function is what you expected it to be. If you have a complex function with several discrete steps, or you pass the output of one function into the input of another, printing the output of each step\/function can save you a lot of frustration. Otherwise, if something breaks, you&#8217;ll have a harder time figuring out where it goes wrong!<\/p>\n<p>Yes, there will probably be an <a href=\"#9-understanding-error-messages\">error message<\/a> pointing to a specific line or spot in your code anyway, but I still prefer to know earlier where possible. A decision you make in solving the error may have consequences in other code elsewhere (e.g. 
choosing to ditch a specific library).<\/p>\n<p>For example, I like to leave a print command, commented out, floating around at the end of <a href=\"https:\/\/github.com\/eltiffster\/authorFunction\/blob\/master\/code\/cleanup.py\">some code<\/a> like this:<code class=\"python language-python\"><br \/>\n<\/code><\/p>\n<p><a href=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/07\/block6.png\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-2651\" src=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/07\/block6.png\" alt=\"\" width=\"892\" height=\"287\" srcset=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/07\/block6.png 892w, https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/07\/block6-620x199.png 620w, https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/07\/block6-768x247.png 768w\" sizes=\"auto, (max-width: 892px) 100vw, 892px\" \/><\/a><\/p>\n<h3 id=\"8startsmallandscaleup\">8. Start Small and Scale Up<\/h3>\n<p>Another way to avoid frustration is to start with the smallest scope possible (in software development circles, called a <a href=\"https:\/\/en.wikipedia.org\/wiki\/Minimum_viable_product\">Minimum Viable Product<\/a> or MVP), and then increase the complexity gradually. I recommend saving the MVP separately before messing with it further. That way, you always have a workable version at hand.<\/p>\n<p>Writing or testing code is an iterative or recursive process. It&#8217;s been described as pushing a broken car down a hill, tinkering under the hood, and then pushing it back up and down the hill again. Put simply, coders run their programs over and over and over again before they work satisfactorily. This might seem odd to humanities scholars when we obsess over just the right words or turn of phrase. 
Although writing is also an iterative or recursive process, we often go through several stages of revision before &#8220;testing&#8221; our work on anyone, especially if we&#8217;re perfectionists.<\/p>\n<p>However, in coding, writing big chunks of code before or without testing can end up working against you, since you wind up in the same conundrum as in number 7: you know there&#8217;s an error, but you don&#8217;t quite know what it is.<\/p>\n<h3 id=\"9understandingerrormessages\">9. Understanding Error Messages<\/h3>\n<p>If you&#8217;ve spent time <a href=\"#4-copypaste\">copy\/pasting solutions<\/a> from the web, you may have noticed that said solutions seem to introduce their own errors. Here are some frequent examples:<\/p>\n<table>\n<thead>\n<tr>\n<th>Error<\/th>\n<th>Sample Error Message<\/th>\n<th>Description<\/th>\n<th>Possible Fix<\/th>\n<\/tr>\n<\/thead>\n<tbody>\n<tr>\n<td>Missing library<\/td>\n<td><em>In Arduino:<\/em><br \/>\n<code>error: 'FastLED' was not declared in this scope<\/code><\/td>\n<td>Happens when you try to use a library that you haven&#8217;t installed or forgot to include\/import.<\/td>\n<td>Go to Sketch &gt; Include Library &gt; Manage Libraries and install the library you need (here, it&#8217;s FastLED). If you installed the library but it&#8217;s still not working, you may have forgotten to include it with <code>#include&lt;libraryName.h&gt;<\/code> (where libraryName is the name of the library).<\/td>\n<\/tr>\n<tr>\n<td>Missing variable<\/td>\n<td><em>In Twine:<\/em><br \/>\n<code>Error: &lt;&lt;print&gt;&gt;: bad evaluation: myVar is not defined<\/code><\/td>\n<td><code>_____ is not defined<\/code> is a classic case of the missing variable. It happens when you call a variable without having defined or declared it (i.e. 
assigned it a value) beforehand.<\/p>\n<p>If you copy\/paste something from a help forum, this can happen because whoever provided the solution used arbitrary variable names for the sake of demonstration.<\/p>\n<p>There could also be a <a href=\"http:\/\/python-textbok.readthedocs.io\/en\/1.0\/Variables_and_Scope.html\">scope issue<\/a>.<\/td>\n<td>Declare the variable earlier in the code (on an earlier line) than where you need it. Note that this will also depend on <a href=\"http:\/\/python-textbok.readthedocs.io\/en\/1.0\/Variables_and_Scope.html\">the scope<\/a> you want your variable to have. Do you need to use the variable in multiple functions? Do you want its value to be the same for each of the functions or change as the program runs?<\/p>\n<p>It&#8217;s helpful to understand <a href=\"https:\/\/www.arduino.cc\/reference\/en\/language\/variables\/variable-scope--qualifiers\/scope\/\">global vs. local variables<\/a> too, but be aware that different programming languages or applications might handle scope differently.<\/td>\n<\/tr>\n<tr>\n<td>You&#8217;re not my type<\/td>\n<td><em>In Python:<\/em><br \/>\n<code>TypeError: Can't convert 'int' object to str implicitly<\/code><\/td>\n<td>For many coding languages, variables and values can be sorted into <a href=\"https:\/\/www.datacamp.com\/community\/tutorials\/data-structures-python#primitive\">several data types<\/a> (e.g. integer, string, list). When you try to perform an operation on one type of data that is meant for another type, you get this error.<\/p>\n<p>In this example, Python is telling you that you&#8217;re trying to treat an integer (a whole number) as if it were a string (of letters or characters or, in other words, text). This error occurs if you try to drop an integer into a prose statement.<\/td>\n<td>Many programming languages have built-in ways of converting one data type to another. 
Here, you&#8217;d probably want the <a href=\"https:\/\/developers.google.com\/edu\/python\/strings\">str() function<\/a>.<\/p>\n<p>You might also want to double-check what data type something is to see if you&#8217;ve declared it correctly. If you&#8217;re not sure what type a variable is in Python, you can use the <a href=\"http:\/\/www.diveintopython.net\/power_of_introspection\/built_in_functions.html\">type()<\/a> function, which returns the type of a specific variable\/value.<\/td>\n<\/tr>\n<tr>\n<td>Syntax error<\/td>\n<td><em>In JavaScript:<\/em><br \/>\n<code>SyntaxError: missing ; before statement<\/code><\/td>\n<td>This is maybe the most annoying problem of the bunch but also the easiest to fix. Chances are, you missed a punctuation mark somewhere. (A classic one is a missing semicolon at the end of a line.)<\/td>\n<td>Pretty straightforward: find and correct the error. Sometimes copy\/pasting the error message into Google will bring up a <a href=\"https:\/\/stackoverflow.com\/\">Stack Overflow<\/a> question from someone who made a similar misstep.<\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<h3 id=\"10readmoredocumentation\">10. Read More Documentation<\/h3>\n<p>Hopefully these tips help, but there&#8217;s no real substitute for reading lots of documentation yourself and experimenting with code. Given how much documentation varies from source to source, it&#8217;s impossible to anticipate every potential situation you could encounter.<\/p>\n<p>Over time, you&#8217;ll get a feel for the scope and complexity of different projects. Where possible, try to come up with a project driven by your own interests rather than an arbitrary exercise. 
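<\/p>
<p>To make the type-conversion fix from the table above concrete, here is a minimal Python sketch that also uses the printing tip; the variable names are invented for illustration:<\/p>

```python
count = 3  # an integer

# Joining an integer straight into text raises a TypeError:
# message = "Found " + count + " matches"

# Converting it explicitly with str() fixes the problem:
message = "Found " + str(count) + " matches"

# type() tells you what kind of value you're working with:
print(type(count))   # <class 'int'>
print(message)       # Found 3 matches
```

<p>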
Like learning any language, even a programming language, navigating documentation is a skill earned through practice and the motivation to make\/say something meaningful to you.<\/p>\n<p><a href=\"#top-of-page\"><strong>Back to Top \u2191<\/strong><\/a><\/p>\n<p>[\/et_pb_tab][et_pb_tab _builder_version=&#8221;3.17.3&#8243; title=&#8221;3D (Structured Light) Scanner&#8221; link_option_url_new_window=&#8221;off&#8221; use_background_color_gradient=&#8221;off&#8221; background_color_gradient_start=&#8221;#2b87da&#8221; background_color_gradient_end=&#8221;#29c4a9&#8243; background_color_gradient_type=&#8221;linear&#8221; background_color_gradient_direction=&#8221;180deg&#8221; background_color_gradient_direction_radial=&#8221;center&#8221; background_color_gradient_start_position=&#8221;0%&#8221; background_color_gradient_end_position=&#8221;100%&#8221; background_color_gradient_overlays_image=&#8221;off&#8221; parallax=&#8221;off&#8221; parallax_method=&#8221;on&#8221; background_size=&#8221;cover&#8221; background_position=&#8221;center&#8221; background_repeat=&#8221;no-repeat&#8221; background_blend=&#8221;normal&#8221; allow_player_pause=&#8221;off&#8221; background_video_pause_outside_viewport=&#8221;on&#8221; tab_text_shadow_style=&#8221;none&#8221; body_text_shadow_style=&#8221;none&#8221; hover_transition_duration=&#8221;300ms&#8221; hover_transition_delay=&#8221;0ms&#8221; hover_transition_speed_curve=&#8221;ease&#8221; tab_text_shadow_horizontal_length=&#8221;0em&#8221; tab_text_shadow_vertical_length=&#8221;0em&#8221; tab_text_shadow_blur_strength=&#8221;0em&#8221; body_text_shadow_horizontal_length=&#8221;0em&#8221; body_text_shadow_vertical_length=&#8221;0em&#8221; body_text_shadow_blur_strength=&#8221;0em&#8221;]<\/p>\n<table style=\"width: 40%\">\n<tbody>\n<tr>\n<td>\n<h1>Overview<\/h1>\n<ul>\n<li>Should I Use This Scanner?\n<ul>\n<li><a href=\"#helpful\">When it&#8217;s Helpful<\/a><\/li>\n<li><a href=\"#not-helpful\">When it&#8217;s Not 
Helpful<\/a><\/li>\n<li><a href=\"#photogrammetry\">Structured Light Scanning vs. Photogrammetry<\/a><\/li>\n<li><a href=\"#examples\">Example Research or Teaching Scenarios<\/a><\/li>\n<\/ul>\n<\/li>\n<li>How to Use the Scanner\n<ul>\n<li><a href=\"#workflow\">Workflow<\/a><\/li>\n<li><a href=\"#without-rotary\">Scanning without the Rotary Table<\/a><\/li>\n<li><a href=\"#with-rotary\">Scanning with the Rotary Table<\/a><\/li>\n<li><a href=\"#post-processing\">Post-processing<\/a><\/li>\n<ul>\n<li><a href=\"#aligning\">Aligning and Combining Meshes<\/a><\/li>\n<li><a href=\"#finalizing\">Finalizing, Exporting Model<\/a><\/li>\n<\/ul>\n<\/ul>\n<\/li>\n<li><a href=\"#shortcuts\">Mouse and Keyboard Shortcuts<\/a><\/li>\n<li><a href=\"#acknowledgments\">Acknowledgements<\/a><\/li>\n<li><a href=\"#references\">References and Further Resources<\/a><\/li>\n<\/ul>\n<\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<h2>Should I use this scanner?<\/h2>\n<h3 id=\"helpful\"><i><span style=\"font-weight: 400\">This scanner is especially helpful when\u2026<\/span><\/i><\/h3>\n<p><b>The object you want to scan is unusually small or detailed. <\/b><span style=\"font-weight: 400\">The HDI 120 is excellent at recording complex shapes or objects smaller than a baseball, which might be difficult to scan by other means. You can also move the scanner, which is mounted on a tripod, to capture features from different angles.<\/span><\/p>\n<p><b>Surface texture is integral to the model\u2019s success. <\/b><span style=\"font-weight: 400\">Even surfaces that look smooth to the naked eye, such as pages from a book, appear textured to the scanner. This texture is visible in the resulting model. 
For example, in <\/span><a href=\"https:\/\/sketchfab.com\/models\/47fb0fa585e44d35952d657f7a062830\"><span style=\"font-weight: 400\">a model of an 1898 book<\/span><\/a><span style=\"font-weight: 400\"> (by William Morris), you can see the impressions the type and woodblocks made during the printing process.<\/span><\/p>\n<figure id=\"attachment_2793\" aria-describedby=\"caption-attachment-2793\" style=\"width: 620px\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/11\/IMG_20181011_135852867_HDR.jpg\"><img loading=\"lazy\" decoding=\"async\" class=\"wp-image-2793 size-medium\" src=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/11\/IMG_20181011_135852867_HDR-620x349.jpg\" alt=\"\" width=\"620\" height=\"349\" srcset=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/11\/IMG_20181011_135852867_HDR-620x349.jpg 620w, https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/11\/IMG_20181011_135852867_HDR-768x432.jpg 768w, https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/11\/IMG_20181011_135852867_HDR-1000x563.jpg 1000w, https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/11\/IMG_20181011_135852867_HDR-1080x608.jpg 1080w\" sizes=\"auto, (max-width: 620px) 100vw, 620px\" \/><\/a><figcaption id=\"caption-attachment-2793\" class=\"wp-caption-text\">A screenshot of the software interface for the HDI 120 scanner. Note the model&#8217;s surface detail and texture.<\/figcaption><\/figure>\n<p><b>You want to create a watertight model for 3D-printing or you want to modify the model after scanning. <\/b><span style=\"font-weight: 400\">Flexscan 3D, the software interface for scanning and processing the model, can fill in holes automatically or using more fine-grained options. 
You can export models in STL, OBJ, or other formats and open them in 3D-modelling software such as <\/span><a href=\"https:\/\/www.tinkercad.com\/\"><span style=\"font-weight: 400\">Tinkercad<\/span><\/a><span style=\"font-weight: 400\">, <\/span><a href=\"https:\/\/www.sketchup.com\/\"><span style=\"font-weight: 400\">SketchUp<\/span><\/a><span style=\"font-weight: 400\">, or <\/span><a href=\"https:\/\/www.blender.org\/\"><span style=\"font-weight: 400\">Blender<\/span><\/a><span style=\"font-weight: 400\"> to make modifications (we offer <\/span><a href=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/workshops\/\"><span style=\"font-weight: 400\">free workshops<\/span><\/a><span style=\"font-weight: 400\"> in Tinkercad). Or, you can insert them directly into <\/span><a href=\"https:\/\/www.makerbot.com\/3d-printers\/makerbot-print\/\"><span style=\"font-weight: 400\">MakerBot Print<\/span><\/a>&nbsp;and email us the .print file (see our <a href=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/how-to-3d-print\/\">3D Printing FAQ page<\/a>)<span style=\"font-weight: 400\">.<\/span><\/p>\n<figure id=\"attachment_2797\" aria-describedby=\"caption-attachment-2797\" style=\"width: 620px\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/11\/luigis.jpg\"><img loading=\"lazy\" decoding=\"async\" class=\"wp-image-2797 size-medium\" src=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/11\/luigis-620x349.jpg\" alt=\"\" width=\"620\" height=\"349\" srcset=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/11\/luigis-620x349.jpg 620w, https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/11\/luigis-768x432.jpg 768w, https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/11\/luigis-1000x563.jpg 1000w, 
https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/11\/luigis-1080x608.jpg 1080w\" sizes=\"auto, (max-width: 620px) 100vw, 620px\" \/><\/a><figcaption id=\"caption-attachment-2797\" class=\"wp-caption-text\">A Luigi figurine (right) next to a 3D-printed version of the same. The model was scanned using the HDI 120 scanner and FlexScan 3D (scanning and processing software).<\/figcaption><\/figure>\n<p><b>You are scanning in an environment with poor lighting or where you cannot place markers on or around your object. <\/b><span style=\"font-weight: 400\">In photogrammetry, one popular method of 3D scanning, a model\u2019s success depends on the quality of the images and the number of distinct reference points such as markers placed on or around the object. However, in some cases, too bright lighting can damage an object or surface, or it may be difficult or undesirable to attach markers to the object itself. Scanning with the HDI 120 avoids these problems.<\/span><\/p>\n<h3 id=\"not-helpful\"><i><span style=\"font-weight: 400\">This scanner is NOT helpful when\u2026<\/span><\/i><\/h3>\n<p><b>You want to scan large objects or recreate features such as architecture or landscapes. <\/b><span style=\"font-weight: 400\">The HDI 120 works best with relatively portable objects and it comes with a rotary table, which you can use to rotate and scan an object automatically. To fit on the rotary table, an object should be no bigger than a basketball. So long as an object fits in our scanning room, you can scan it without a rotary table, although this would require more post-processing. Recreating architecture or landscapes is better suited to software such as <\/span><a href=\"https:\/\/www.sketchup.com\/\"><span style=\"font-weight: 400\">SketchUp<\/span><\/a><span style=\"font-weight: 400\"> or 3D-mapping with drones or aerial photographs.<\/span><\/p>\n<p><b>You want to scan persons or moving objects. 
<\/b><span style=\"font-weight: 400\">Each scan takes at least a few seconds on the HDI 120. This makes the scanner unsuitable for capturing moving objects, even if the object (or person) moves only a few degrees.<\/span><\/p>\n<p><b>Colour is integral to the model\u2019s success.<\/b><span style=\"font-weight: 400\"> Although there\u2019s limited support in Flexscan 3D for overlaying images or patterns on the model\u2019s surface, capturing colour is better suited to photogrammetry or a structured light scanner with photographic capabilities (such as the GoScan 20 in the library\u2019s <\/span><a href=\"https:\/\/www.uvic.ca\/library\/featured\/digitization\/index.php\"><span style=\"font-weight: 400\">Digitization Centre<\/span><\/a><span style=\"font-weight: 400\">).<\/span><\/p>\n<p><b>The object can be broken down into simple shapes (e.g. cubes, spheres, or pyramids). <\/b><span style=\"font-weight: 400\">In this case, it may be easier to create a model from scratch in a program such as <\/span><a href=\"https:\/\/www.tinkercad.com\/\"><span style=\"font-weight: 400\">Tinkercad<\/span><\/a><span style=\"font-weight: 400\"> rather than scanning the object. This could have the added benefit of a smooth surface rather than the grainy texture the HDI 120 would provide.<\/span><\/p>\n<h3 id=\"photogrammetry\"><span style=\"font-weight: 400\">How does this compare to photogrammetry?<\/span><\/h3>\n<p><span style=\"font-weight: 400\">Photogrammetry (<\/span><a href=\"https:\/\/en.wikipedia.org\/wiki\/Photogrammetry\"><span style=\"font-weight: 400\">Wikipedia<\/span><\/a><span style=\"font-weight: 400\">) relies on taking 2D photographs and using reference points that appear in more than one image in order to construct a 3D model. 
Examples of photogrammetry software include <\/span><a href=\"http:\/\/www.agisoft.com\/\"><span style=\"font-weight: 400\">PhotoScan<\/span><\/a><span style=\"font-weight: 400\"> and <\/span><a href=\"https:\/\/www.aniwaa.com\/product\/3d-scanners\/smartmobilevision-scann3d\/\"><span style=\"font-weight: 400\">Scann3D<\/span><\/a><span style=\"font-weight: 400\">. However, the HDI 120 doesn\u2019t use photogrammetry. Instead, it uses a process called structured light scanning (SLS): it triangulates points in space based on the way light distorts when hitting an object\u2019s surface. With structured light scanning, each scan results in a 3D mesh whereas, in photogrammetry, each \u201cscan\u201d is actually a 2D image.<\/span><\/p>\n<figure id=\"attachment_2798\" aria-describedby=\"caption-attachment-2798\" style=\"width: 620px\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/11\/scaling_03.jpg\"><img loading=\"lazy\" decoding=\"async\" class=\"wp-image-2798 size-medium\" src=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/11\/scaling_03-620x504.jpg\" alt=\"\" width=\"620\" height=\"504\" srcset=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/11\/scaling_03-620x504.jpg 620w, https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/11\/scaling_03-768x624.jpg 768w, https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/11\/scaling_03-1000x813.jpg 1000w, https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/11\/scaling_03-1080x878.jpg 1080w, https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/11\/scaling_03.jpg 1155w\" sizes=\"auto, (max-width: 620px) 100vw, 620px\" \/><\/a><figcaption id=\"caption-attachment-2798\" class=\"wp-caption-text\">Image of a model of a dinosaur 
skeleton in photogrammetry software. The blue rectangles denote where each 2D image was taken with reference to the scanned object. Image credit Martin J. Pratt at martinjpratt.wordpress.com.<\/figcaption><\/figure>\n<p><span style=\"font-weight: 400\">Because of this, you don\u2019t have as much control over individual scans in photogrammetry as you would with SLS. In photogrammetry, assembling the scans into a model is handled automatically by the software and this process is rather opaque. By comparison, you can take multiple scans of an object (each called a \u201cmesh\u201d) and piece them together in FlexScan 3D by clicking and dragging like assembling a 3D puzzle. You can select and move overlapping meshes into roughly the same position, then click \u201cAlign\u201d and FlexScan 3D will automatically align them more precisely. You can edit each mesh individually and\/or combine meshes into a finalized model. By contrast changing or editing the requisite photos in photogrammetry often means you have to run the entire assembly process over again without previewing the results.<\/span><\/p>\n<h3 id=\"examples\"><span style=\"font-weight: 400\">Example Research or Teaching Scenarios<\/span><\/h3>\n<p><span style=\"font-weight: 400\">In a museum or classroom setting, structured light scanning can be a non-invasive way for audiences to interact with an object both on and off-screen (with printed models), where the object might be too fragile or rare to handle. In some cases, museum staff may scan a model and use it as the basis for computer-aided restoration (see K\u0119sik et al.).<\/span><\/p>\n<p><span style=\"font-weight: 400\">A 3D-printed model may convey a sense of size, proportion, or texture that would be difficult to grasp when viewed on-screen. 
One example of this is the <\/span><a href=\"https:\/\/www.3dhotbed.info\/project\/\"><span style=\"font-weight: 400\">3Dhotbed project<\/span><\/a><span style=\"font-weight: 400\">, a collaboration between library professionals from universities in the United States. This project seeks to enhance book history instruction, chiefly through models of typecasting equipment that are difficult to find in their original form.<\/span><\/p>\n<figure id=\"attachment_2800\" aria-describedby=\"caption-attachment-2800\" style=\"width: 620px\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/11\/MLab.jpg\"><img loading=\"lazy\" decoding=\"async\" class=\"wp-image-2800 size-medium\" src=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/11\/MLab-620x414.jpg\" alt=\"\" width=\"620\" height=\"414\" srcset=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/11\/MLab-620x414.jpg 620w, https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/11\/MLab-768x512.jpg 768w, https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/11\/MLab-1000x667.jpg 1000w, https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/11\/MLab-1080x720.jpg 1080w, https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/11\/MLab.jpg 1150w\" sizes=\"auto, (max-width: 620px) 100vw, 620px\" \/><\/a><figcaption id=\"caption-attachment-2800\" class=\"wp-caption-text\">Image of the scanner with a hand-carved model of a wooden skull. 
Image credit: Shaun MacPherson from the UVic MLab website (maker.uvic.ca\/scanning\/).<\/figcaption><\/figure>\n<p><span style=\"font-weight: 400\">With regards to media history, the <\/span><a href=\"https:\/\/maker.uvic.ca\/\"><span style=\"font-weight: 400\">UVic MLab <\/span><\/a><span style=\"font-weight: 400\">has used the HDI 120 to produce models for research and experimentation. For example, <\/span><a href=\"https:\/\/maker.uvic.ca\/scanner\/\"><span style=\"font-weight: 400\">MLab researchers scanned<\/span><\/a><span style=\"font-weight: 400\"> late 19th-century telephone receivers and transmitters when remaking early magnetic recording experiments (1898). As part of the <\/span><a href=\"https:\/\/github.com\/uvicmakerlab\/earlyWearablesKit\"><span style=\"font-weight: 400\">Early Wearables Kit<\/span><\/a><span style=\"font-weight: 400\">, they also scanned a sculpture of a skull and shrank the model down in order to study whether a skull stick-pin, of the particular size and shape its inventor claimed, could really have functioned at that scale.<\/span><\/p>\n<p><span style=\"font-weight: 400\">Structured light scanning is also used in disciplines such as archaeology, anatomy education, and zooarchaeology to document 3D objects or features both in labs and in the field. 
For example, Hess, MacDonald, and Valach <\/span><a href=\"https:\/\/rdcu.be\/87Qr\"><span style=\"font-weight: 400\">scanned models<\/span><\/a><span style=\"font-weight: 400\"> of two ancient Roman coins and praised structured light scanning for its precision, flexibility, and transportability compared to other methods (see references for more information).<\/span><\/p>\n<p>&nbsp;<\/p>\n<h2><span style=\"font-weight: 400\">How to Use the Scanner<\/span><\/h2>\n<h3 id=\"workflow\">Workflow<\/h3>\n<p>Here are the steps to generate a 360-degree model of an object:<b><\/b><\/p>\n<ul>\n<li><b>Setting up the scanner and collecting scans: <span style=\"font-weight: 400\">this involves positioning the scanner, calibrating the rotary table (if using it), and collecting the scans.<\/span><\/b><\/li>\n<li><b>Post-processing: <span style=\"font-weight: 400\">this involves digitally stitching together the meshes, deleting unwanted geometry, and filling in any holes in the model (see \u201cNavigating the Software\u201d for more details)<\/span><\/b><\/li>\n<li><b><b>Exporting the model:<\/b><span style=\"font-weight: 400\"> Now you have a high-quality model that you can export as an .stl or .obj file and import into 3D-modelling software. If all holes have been filled, the model is also watertight and can be 3D-printed by inserting the model into a <\/span><a style=\"font-weight: 500\" href=\"https:\/\/www.makerbot.com\/3d-printers\/makerbot-print\/\">MakerBot Print<\/a><span style=\"font-weight: 400\"> file.<\/span><\/b><\/li>\n<\/ul>\n<h3 id=\"without-rotary\">Scanning without the Rotary Table<\/h3>\n<ol>\n<li><b>Prepare and position the scanner.&nbsp;<\/b><span style=\"font-weight: 400\">Click on the \u201cProjects\u201d tab and create a new project or open an existing project to keep working on it. The scanner\u2019s camera should turn on, and you can see the live feed in the panel on the right side of the screen (see image below). 
Adjust the scanner so that the object is visible in the feed.<\/span><\/li>\n<li><b>Adjust the exposure.&nbsp;<\/b>In the camera feed, areas are coloured white, red, or blue (see images below).\n<ul>\n<li style=\"font-weight: 400\"><span style=\"font-weight: 400\">Areas in <\/span><b>white<\/b><span style=\"font-weight: 400\"> are surfaces or geometry that the scanner will pick up well.<\/span><\/li>\n<li style=\"font-weight: 400\"><span style=\"font-weight: 400\">Areas in <\/span><b>red<\/b><span style=\"font-weight: 400\"> are over-exposed, i.e. <\/span><b>too close<\/b><span style=\"font-weight: 400\"> to the scanner (with current settings) to pick up well. Lower the exposure or move the object further from the scanner.<\/span><\/li>\n<li style=\"font-weight: 400\"><span style=\"font-weight: 400\">Areas in <\/span><b>blue<\/b><span style=\"font-weight: 400\"> are under-exposed, i.e.<\/span><b> too far away<\/b><span style=\"font-weight: 400\"> for the scanner (with current settings) to pick up well. Increase the exposure or move the object closer to the scanner.<\/span><\/li>\n<\/ul>\n<\/li>\n<\/ol>\n<p><a href=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/12\/OverExposed.jpg\">&nbsp; &nbsp; &nbsp; &nbsp; &nbsp;&nbsp;<img loading=\"lazy\" decoding=\"async\" class=\"alignnone wp-image-2848\" src=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/12\/OverExposed.jpg\" alt=\"\" width=\"357\" height=\"271\"><\/a><img loading=\"lazy\" decoding=\"async\" class=\"alignnone wp-image-2849\" src=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/12\/UnderExposed.jpg\" alt=\"\" width=\"356\" height=\"266\"><i><\/i><\/p>\n<p style=\"text-align: center\"><i><span style=\"font-weight: 400\">Images of a calibration sheet as seen through the live feed. Areas in red are over-exposed (left) while blue areas (right) are under-exposed. 
Images from the FlexScan 3D manual.<\/span><\/i><\/p>\n<p><span style=\"font-weight: 400\">Above the live feed is the exposure panel. Click and drag the slider left or right to adjust the exposure and the live feed will change colour accordingly. You want as much of the object as possible to look white onscreen.<\/span><\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"wp-image-2851 alignleft\" src=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/12\/sidebar.jpg\" alt=\"\" width=\"215\" height=\"752\"><\/p>\n<p><span style=\"font-weight: 400\">FlexScan 3D has some options for automatically setting exposure. On the left-side panel (see image left), under \u201cEasy Scan,\u201d you can click \u201cAuto Exposure\u201d and FlexScan will try to optimize the exposure level for maximum detail and fidelity (results vary). After auto-adjusting, you can still manually tweak or fine-tune the exposure setting. You can also click the bright red \u201cScan\u201d button and see what mesh is actually generated. This scan can also help you decide what exposure works best.<\/span><\/p>\n<p><b>3. Scan the object, reposition the scanner, and adjust exposure as needed.&nbsp;<\/b>When you\u2019re ready, click \u201cScan\u201d and the scanner will scan the object and create a \u201cmesh.\u201d Reposition or rotate the object and then click \u201cScan\u201d again to generate another mesh. Repeat until you\u2019re satisfied that you\u2019ve captured all the surfaces and details you need.<\/p>\n<p><b>4. You\u2019re now ready for post-processing<span style=\"font-weight: 400\">. See the \u201cPost-Processing\u201d section.<\/span><\/b><\/p>\n<h3 id=\"with-rotary\"><span style=\"font-weight: 400\">Scanning with the Rotary Table<\/span><\/h3>\n<p><span style=\"font-weight: 400\">The rotary table is useful when you have a relatively small object and want to make a 360\u00b0 scan. 
Without the rotary table, you would have to manually rotate the object between scans. You can also do a combination<\/span> of scanning with and without the rotary table. For example, after using the rotary table, you can flip an object to scan the underside and then combine the meshes.<\/p>\n<p><strong>1.&nbsp;Calibrate the rotary table.&nbsp;<\/strong>For the scanner to automatically generate meshes for an object on the rotary table, we must first&nbsp;calibrate the scanner so that it knows the distance from itself to&nbsp;each point of the object on the rotary table. Place the calibration sheet (a checkerboard of white and black squares) on the book rest and place both close to the centre of the rotary table.<\/p>\n<p><span style=\"font-weight: 400\"><a href=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/12\/rotarytable.png\"><img loading=\"lazy\" decoding=\"async\" class=\"size-full wp-image-2852 alignright\" src=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/12\/rotarytable.png\" alt=\"\" width=\"207\" height=\"136\"><\/a>On the left-side panel, check the box next to the \u201cRotary Table\u201d heading (see image left or right). Adjust the calibration sheet so that the grid is fully and clearly visible from both live feed windows (you may have to adjust the exposure). Try to align the crosshairs in both windows on the same square, although a little deviation is fine. Click \u201cCalibrate.\u201d The rotary table will begin to rotate and the scanner will scan the calibration sheet from different angles.<\/span><\/p>\n<p><span style=\"font-weight: 400\">FlexScan will say if calibration was successful or unsuccessful. If the calibration was unsuccessful, reposition the calibration sheet or adjust the exposure and try again. Once the scanner is successfully calibrated, avoid moving the rotary table or the scanner. 
If you have to move them, you can recalibrate the scanner (click \u201cRecalibrate\u201d in the Rotary panel).<\/span><\/p>\n<p><b>2. Scan the object and adjust exposure as needed<\/b><span style=\"font-weight: 400\">. The rotary panel allows you to select how many scans to take of the object in one full circle of the table. For example, if you select 8 scans (which is usually sufficient for a good model), then the scanner will take one scan every 45 degrees. When ready, continue to post-processing.<\/span><\/p>\n<p>&nbsp;<\/p>\n<h2 id=\"post-processing\"><span style=\"font-weight: 400\">Post-Processing<\/span><\/h2>\n<p><span style=\"font-weight: 400\">You should now have a bunch of meshes representing your object that need to be merged into a single model.<\/span><\/p>\n<h3 id=\"aligning\"><span style=\"font-weight: 400\">Aligning and Combining Meshes<\/span><\/h3>\n<ol>\n<li style=\"font-weight: 400\"><span style=\"font-weight: 400\"><span style=\"font-weight: 400\">To select a single mesh, click its thumbnail on the left. The selected mesh will turn red and there will be a yellow sphere, surrounded by circular outlines in blue, green, and red (the x-, y-, and z-axes). 
Then you can move the mesh around the screen by clicking and dragging the yellow sphere, or you can rotate the mesh around the x-, y-, or z-axis by clicking and dragging a circle.<\/span><\/span>\n<figure id=\"attachment_2860\" aria-describedby=\"caption-attachment-2860\" style=\"width: 620px\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/12\/align.png\"><img loading=\"lazy\" decoding=\"async\" class=\"wp-image-2860 size-medium\" src=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/12\/align-620x384.png\" alt=\"Image of two scans of the same object aligned together, with one of them (in red) currently selected for moving or rotating.\" width=\"620\" height=\"384\" srcset=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/12\/align-620x384.png 620w, https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/12\/align-768x476.png 768w, https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/12\/align-1000x620.png 1000w, https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/12\/align-1080x669.png 1080w, https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/12\/align.png 1199w\" sizes=\"auto, (max-width: 620px) 100vw, 620px\" \/><\/a><figcaption id=\"caption-attachment-2860\" class=\"wp-caption-text\">Image of two scans of the same object aligned together, with one of them (in red) currently selected for moving or rotating.<\/figcaption><\/figure>\n<p>&nbsp;<\/li>\n<li style=\"font-weight: 400\"><span style=\"font-weight: 400\">Position and rotate a mesh over at least one other mesh, so that the two meshes overlap over a common point or feature. The fit doesn\u2019t have to be exact since FlexScan will automatically align the scans more exactly. 
Then select both or all meshes (use Ctrl-click to select multiple meshes) and click the \u201cAlign\u201d button at the top. The scans are now grouped together. To detach and re-align them (e.g. if they\u2019re misaligned), double-click on each thumbnail and you\u2019ll be able to select and align the scans again.<\/span><\/li>\n<\/ol>\n<p><span style=\"font-weight: 400\">3. To delete unwanted geometry, hold down Ctrl and click\/drag over the area you want to remove, then press Delete.<\/span><\/p>\n<p><span style=\"font-weight: 400\">4. Once two meshes are aligned, they belong to the same alignment group. This means that if you select one mesh that is already part of a group and a second mesh that you want to merge into it, aligning them will align the second mesh with every other scan in that group.<\/span><\/p>\n<p><span style=\"font-weight: 400\">5. When you\u2019re satisfied, you can select multiple meshes and combine them (click \u201cCombine\u201d). The advantage of combining meshes is that they will be \u201clocked\u201d together (you can double-click to unlock them). 
Eventually, all meshes should be combined into one model.<\/span><\/p>\n<p><a href=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/12\/alignbutton.png\"><img loading=\"lazy\" decoding=\"async\" class=\"size-full wp-image-2862 alignnone\" src=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/12\/alignbutton.png\" alt=\"\" width=\"213\" height=\"174\"><\/a><a href=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/12\/combinebutton.png\"><img loading=\"lazy\" decoding=\"async\" class=\"size-medium wp-image-2863 alignnone\" src=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/12\/combinebutton.png\" alt=\"\" width=\"197\" height=\"125\"><\/a><\/p>\n<p style=\"text-align: center\"><i><span style=\"font-weight: 400\">The \u201cAlign\u201d and \u201cCombine\u201d buttons in FlexScan 3D.<\/span><\/i><\/p>\n<h3 id=\"finalizing\"><span style=\"font-weight: 400\">Finalizing and\/or Exporting the Model<\/span><\/h3>\n<p><span style=\"font-weight: 400\">6. Once all meshes have been combined, you can finalize the model by clicking \u201cFinalize.\u201d The finalize panel and options will appear. Here you can fill in small holes in your model or smooth its texture.<\/span><\/p>\n<p><span style=\"font-weight: 400\">7. Adjust the \u201cHole Filling\u201d slider as desired. Selecting \u201cHigh\u201d will fill in the holes completely. This is useful if you want to create a watertight model for 3D printing. Note that FlexScan fills holes by making educated guesses based on the surrounding geometry. If the results aren\u2019t what you wanted, you may have to uncombine the scans and take additional scans before aligning, recombining, and finalizing the model again.<\/span><\/p>\n<p><span style=\"font-weight: 400\">8. 
You can also have more fine-grained control over hole filling. See <\/span><a href=\"https:\/\/tinyurl.com\/y7nezezt\"><span style=\"font-weight: 400\">tinyurl.com\/y7nezezt<\/span><\/a><span style=\"font-weight: 400\"> for more.<\/span><\/p>\n<p><span style=\"font-weight: 400\">9. If you want a smoother texture or a smaller file size, you can adjust the \u201cSmoothing Options\u201d slider or check the \u201cDecimate\u201d box and drag its slider left or right. The \u201cDecimate\u201d option makes the model less dense by systematically removing some of its vertices. The result is a very similar model that takes up much less storage space.<\/span><\/p>\n<p><span style=\"font-weight: 400\">10. You can also export the model (click \u201cExport\u201d along the top toolbar) as an .obj or .stl file (among other formats). You can then import this file into Makerbot Print or other 3D printing or design software. (To learn more about 3D Printing at the Digital Scholarship Commons, see <\/span><a href=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/how-to-3d-print\/\"><span style=\"font-weight: 400\">oac.uvic.ca\/dsc\/how-to-3d-print<\/span><\/a><span style=\"font-weight: 400\">.)<\/span><\/p>\n<h3 id=\"shortcuts\"><span style=\"font-weight: 400\">Mouse and Keyboard Shortcuts<\/span><\/h3>\n<table>\n<tbody>\n<tr>\n<td><b>Button or Keys<\/b><\/td>\n<td><b>Action<\/b><\/td>\n<\/tr>\n<tr>\n<td><span style=\"font-weight: 400\">Left click on empty space and drag<\/span><\/td>\n<td><span style=\"font-weight: 400\">Rotate around the model<\/span><\/td>\n<\/tr>\n<tr>\n<td><span style=\"font-weight: 400\">Middle mouse button and drag<\/span><\/td>\n<td><span style=\"font-weight: 400\">Pan the model<\/span><\/td>\n<\/tr>\n<tr>\n<td><span style=\"font-weight: 400\">Click and drag the yellow sphere (after selecting a mesh)<\/span><\/td>\n<td><span style=\"font-weight: 400\">Move the entire mesh, keeping the same orientation (i.e. 
without rotating)<\/span><\/td>\n<\/tr>\n<tr>\n<td><span style=\"font-weight: 400\">Click and drag along the green, blue, or red circles (after selecting the mesh)<\/span><\/td>\n<td><span style=\"font-weight: 400\">Rotate the mesh around an axis (X, Y, or Z)<\/span><\/td>\n<\/tr>\n<tr>\n<td><span style=\"font-weight: 400\">Ctrl + click and drag<\/span><\/td>\n<td><span style=\"font-weight: 400\">Select geometry<\/span><\/td>\n<\/tr>\n<tr>\n<td><span style=\"font-weight: 400\">Delete (after selecting geometry)<\/span><\/td>\n<td><span style=\"font-weight: 400\">Delete the selected geometry<\/span><\/td>\n<\/tr>\n<tr>\n<td><span style=\"font-weight: 400\">Hold down Ctrl + click on the thumbnails for each mesh you want to select<\/span><\/td>\n<td><span style=\"font-weight: 400\">Select multiple meshes<\/span><\/td>\n<\/tr>\n<tr>\n<td><span style=\"font-weight: 400\">Select at least one mesh, then Ctrl + a<\/span><\/td>\n<td><span style=\"font-weight: 400\">Select all meshes<\/span><\/td>\n<\/tr>\n<tr>\n<td><span style=\"font-weight: 400\">Click the revert button<\/span><\/td>\n<td><span style=\"font-weight: 400\">Undo an action (unfortunately, FlexScan 3D can only undo the most recent action)<\/span><\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<h3 id=\"acknowledgments\"><span style=\"font-weight: 400\">Acknowledgments<\/span><\/h3>\n<p>Thanks to the UVic MLab (especially Katherine Goertz, Danielle Morgan, and Jentery Sayers) and the Humanities Computing and Media Centre (HCMC).<\/p>\n<h3 id=\"references\"><span style=\"font-weight: 400\">References and Further Resources<\/span><\/h3>\n<p><span style=\"font-weight: 400\">Belojevic, Nina et al. <\/span><i><span style=\"font-weight: 400\">Early Wearables Kit (2015)<\/span><\/i><span style=\"font-weight: 400\">,<\/span> <span style=\"font-weight: 400\">Nov. 
2015, <\/span><a href=\"https:\/\/github.com\/uvicmakerlab\/earlyWearablesKit\"><span style=\"font-weight: 400\">github.com\/uvicmakerlab\/earlyWearablesKit<\/span><\/a><\/p>\n<p><span style=\"font-weight: 400\">Goertz, Katherine. \u201cHandling History with a Scanner.\u201d <\/span><i><span style=\"font-weight: 400\">UVic MLab<\/span><\/i><span style=\"font-weight: 400\">, May 15, 2016, <\/span><a href=\"https:\/\/maker.uvic.ca\/scanner\/\"><span style=\"font-weight: 400\">maker.uvic.ca\/scanner\/<\/span><\/a><\/p>\n<p><span style=\"font-weight: 400\">Goertz, Katherine, Danielle Morgan, and Jentery Sayers. \u201c3D Scanning.\u201d <\/span><i><span style=\"font-weight: 400\">Physical Computing and Fabrication (DHSI 2016), <\/span><\/i><span style=\"font-weight: 400\">University of Victoria and GitHub, 2016, <\/span><a href=\"https:\/\/github.com\/uvicmakerlab\/dhsi2016\/blob\/master\/3Dscanning.md\"><span style=\"font-weight: 400\">github.com\/uvicmakerlab\/dhsi2016\/blob\/master\/3Dscanning.md<\/span><\/a><\/p>\n<p><span style=\"font-weight: 400\">Hess, Mona et al. \u201cApplication Of Multi-modal 2D and 3D Imaging And Analytical Techniques to Document and Examine Coins on the Example Of Two Roman Silver Denarii.\u201d <\/span><i><span style=\"font-weight: 400\">Heritage Science<\/span><\/i><span style=\"font-weight: 400\">, vol. 6, no. 5, December 2018, <\/span><a href=\"https:\/\/link.springer.com\/article\/10.1186%2Fs40494-018-0169-2\"><span style=\"font-weight: 400\">link.springer.com\/article\/10.1186%2Fs40494-018-0169-2<\/span><\/a><\/p>\n<p><span style=\"font-weight: 400\">Jacobs, Courtney, Marcia McIntosh, and Kevin M. O\u2019Sullivan. <\/span><i><span style=\"font-weight: 400\">3Dhotbed project,<\/span><\/i><span style=\"font-weight: 400\"> n.d., <\/span><a href=\"https:\/\/www.3dhotbed.info\/project\/\"><span style=\"font-weight: 400\">www.3dhotbed.info\/project\/<\/span><\/a><\/p>\n<p><span style=\"font-weight: 400\">K\u0119sik, Jacek et al. 
&#8220;An Approach To Computer-aided Reconstruction Of Museum Exhibits.&#8221; <\/span><i><span style=\"font-weight: 400\">Advances in Science and Technology Research Journal<\/span><\/i><span style=\"font-weight: 400\">, vol. 11, no. 2, 2017, pp. 87-94. doi:10.12913\/22998624\/69419, <\/span><a href=\"http:\/\/www.astrj.com\/An-approach-to-computer-aided-reconstruction-of-museum-exhibits,69419,0,2.html\"><span style=\"font-weight: 400\">astrj.com\/An-approach-to-computer-aided-reconstruction-of-museum-exhibits,69419,0,2.html<\/span><\/a><\/p>\n<p>LMI Calibration page (for calibrating the rotary table).&nbsp;<a href=\"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/11\/Duo_CalibImage_10mm.pdf\">oac.uvic.ca\/dsc\/wp-content\/uploads\/sites\/2373\/2018\/11\/Duo_CalibImage_10mm.pdf<\/a><\/p>\n<p>LMI Technologies YouTube Channel (see videos on Flexscan 3D).&nbsp;<a href=\"https:\/\/www.youtube.com\/user\/LMITechnologies\/videos?flow=grid&amp;view=0&amp;sort=da\">youtube.com\/user\/LMITechnologies\/videos?flow=grid&amp;view=0&amp;sort=da<\/a><\/p>\n<p>&#8220;HDI 100 Series User Manual.&#8221; LMI Technologies. 
(Note: to download, you must create a free account with LMI.)&nbsp;<a href=\"https:\/\/downloads.lmi3d.com\/?brand=151&amp;product=6056\">downloads.lmi3d.com\/?brand=151&amp;product=6056<\/a><\/p>\n<p>[\/et_pb_tab][\/et_pb_tabs][\/et_pb_column][\/et_pb_row][\/et_pb_section]<\/p>\n","protected":false},"excerpt":{"rendered":"<p>[et_pb_section bb_built=&#8221;1&#8243; admin_label=&#8221;section&#8221; _builder_version=&#8221;3.0.47&#8243;][et_pb_row admin_label=&#8221;row&#8221; _builder_version=&#8221;3.0.47&#8243; background_size=&#8221;initial&#8221; background_position=&#8221;top_left&#8221; background_repeat=&#8221;repeat&#8221;][et_pb_column type=&#8221;4_4&#8243;][et_pb_text _builder_version=&#8221;3.17.3&#8243; background_size=&#8221;initial&#8221; background_position=&#8221;top_left&#8221; background_repeat=&#8221;repeat&#8221; background_layout=&#8221;light&#8221;] This page is under construction. Check back [&hellip;]<\/p>\n","protected":false},"author":19673,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"footnotes":""},"class_list":["post-1831","page","type-page","status-publish","hentry"],"jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-json\/wp\/v2\/pages\/1831","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-json\/wp\/v2\/users\/19673"}],"replies":[{"embeddable":true,"href":"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-json\/wp\/v2\/comments?post=1831"}],"version-history":[{"count":44,"href":"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-json\/wp\/v2\/pages\/1831\/revisions"}],"predecessor-version":[{"id":2872,"href":"https:\/\/onlineacademiccommunity.
uvic.ca\/dsc\/wp-json\/wp\/v2\/pages\/1831\/revisions\/2872"}],"wp:attachment":[{"href":"https:\/\/onlineacademiccommunity.uvic.ca\/dsc\/wp-json\/wp\/v2\/media?parent=1831"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}