I’ve been working on my whitepaper and it’s been coming along great. https://docs.google.com/document/d/1vI7yHmIrmRv9ZyGRGoKuUc9yd8HdwKrqNnCSN8rjdtQ/edit


As the school year comes to an end, so does my time at WISRD, hence the white paper. I have finished updating and finalizing my vitae.


Over the past few days, I’ve been working on something called a white paper. A white paper is a document you create for people in the future who want to learn about the topic you write about. My white paper will be on Blender.



As my time at WISRD comes to an end, I’ve decided to try to learn how to solder. It’s very hard and requires a lot of skill.


Yesterday we had our 2nd poster presentation. A lot of people came.



After experimenting with Unreal Engine before formally learning it, I have come to the conclusion that this piece of software will be vital for our group. The possibilities are endless.


I will try and learn how to use Unreal Engine.


I’m back from spring break and our group has already started working on our new spring poster.


I’ve done some file organization for our group scans.

Head to the server; from there, go to assets > AR/VR > Jetson_Nano > Scans, where you can find all of our scanning progress.


Since the theatre was occupied today, Marley and I decided to give scanning the gallery another go. Instead of scanning one big room or just one wall, we thought it would be smarter to scan three separate walls in three separate passes. This ended up being much more efficient, and the scans came out a lot better. We learned that the ZED camera is very light sensitive.


Over the weekend, Marley and I found out what was wrong with importing textures onto our scan. Before scanning, there is an option to include texture in your scan. Now that we know that, we should be able to add textures and organic colors to the scan.


We were able to successfully scan half of the gallery and import it into Blender; however, we’ve run into an issue. We are unable to import the textures onto the scan like everything else.


I’ve figured out how to add textures to an object in Blender. Another option besides .obj and .mtl files is a file type called .glb. It’s a 3D file format used in VR and AR.
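One handy thing about .glb files is that you can sanity-check them without opening Blender: per the glTF spec, a GLB file starts with a 12-byte header containing the ASCII magic "glTF", a version number, and the total file length. A minimal sketch (the handmade header bytes here are just for illustration):

```python
import struct

def read_glb_header(data: bytes):
    """Parse the 12-byte GLB header: magic, version, total length (little-endian)."""
    magic, version, length = struct.unpack("<4sII", data[:12])
    if magic != b"glTF":
        raise ValueError("not a GLB file")
    return version, length

# Handmade example header: version 2, 12-byte file
header = struct.pack("<4sII", b"glTF", 2, 12)
print(read_glb_header(header))  # (2, 12)
```

If a scan export won't import, reading the header this way tells you quickly whether the file is actually a GLB or something mislabeled.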

If Outlook (email) is blocked, sign in through OneLogin (wildwoodschool.onelogin.com).



I’m now learning how to import .obj and .mtl files into Blender. Marley can now send me a scan from the ZED camera, and I’ll be able to import it into Blender. The problem we’re having is figuring out how to add the textures to the scan. When you import a .obj file that comes with textures, there should be an option to apply them to the scan.
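When textures don't show up, one thing worth checking is whether the .obj actually references its material file. In the OBJ format, a `mtllib` line names the companion .mtl file and `usemtl` lines select materials from it. A small sketch that scans an OBJ's text for those references (the file contents below are made up for illustration):

```python
def find_material_refs(obj_text: str):
    """Collect mtllib (material files) and usemtl (material names) from OBJ text."""
    mtl_files, materials = [], []
    for line in obj_text.splitlines():
        parts = line.split()
        if not parts:
            continue
        if parts[0] == "mtllib":
            mtl_files.append(" ".join(parts[1:]))
        elif parts[0] == "usemtl":
            materials.append(parts[1])
    return mtl_files, materials

sample = """mtllib gallery_scan.mtl
v 0.0 0.0 0.0
usemtl wall_texture
f 1 1 1
"""
print(find_material_refs(sample))  # (['gallery_scan.mtl'], ['wall_texture'])
```

If the `mtllib` line is missing, or the named .mtl file isn't sitting next to the .obj, Blender has nothing to apply, which would explain a textureless import.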


Today, I learned the basics of Onshape. I can build and remove objects, as well as edit them to include things like holes.


Important videos about Onshape


Current Onshape password:



I have finally finished my article and will now start my research on learning how to use Onshape.


I’m about a third of the way done with my article and have found a much cheaper and better cable option for the Oculus Quest 2.

Article link:


Personal priority list:

Finishing Article

Learning more about Blender


I have started the research for my article draft. My article will be about how different companies and professions use VR in their everyday activities and to train other members.


I was looking around the workspace and came across a masterpiece: a book called Practical Game Development with Unity and Blender by Alan Thorn.


I have decided to replicate the Soborg Wood base chair made by Fredericia.


I have decided to not work on the donut anymore. I feel as if I have learned enough and will now learn how to create a chair with texture.


We finally got the computer up and running with a brand-new, much bigger hard drive. With that done, I’m installing the newest version of Blender on this computer.


I am working on my WISRD pitch, which is on VR training. I have made a lot of progress in Blender. As of right now, I’m exploring all of Blender’s possibilities by making a donut.


Sam has gone to deal with the broken computer on his own. I’ve started to learn how to use Blender. Blender will help us patch holes in our 3D scans. Learning Blender is extremely difficult.


One of our PCs didn’t have enough disk space, so we bought a 2 TB hard drive. We took the computer apart and realized how dirty the entire thing was, so we completely disassembled it and cleaned everything out.


We had our Zoom meeting with Will, and he suggested many helpful things. I’m going to learn how to use Blender and have started on the fundamentals.


We have found a time when we can start scanning the theatre room. However, we’ve hit a bit of a setback: we are trying a completely new recording engine called ZEDfu. So far, ZEDfu is much better in terms of quality and is the most straightforward. We are also posting a new tweet, and we’ve learned how to import certain objects into AR (HoloLens).


Marley and I went to scan the gallery and made some huge progress. However, without the Jetson Nano board, it’s hard to scan everything. We also haven’t figured out how to save a recording in Unity.


Today, I was going to have my interview, however Joe couldn’t meet with me.


Marley and I scanned the gallery and made huge progress. However, every time we scan it, the scan deletes itself.


Skeletal tracking (ZED) 

The new software for the ZED camera allows one to track the skeletal system of a human. It works by tracking the human’s bones via key points. The key points of a bone are its two endpoints (bottom and top), which define the initial length of the bone.
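The two-key-points-per-bone idea above can be sketched as a tiny calculation: given the top and bottom points of a bone in 3D, the bone length is just the Euclidean distance between them (the coordinates below are made-up values, not real ZED output):

```python
import math

def bone_length(top, bottom):
    """Euclidean distance between a bone's two key points (3D coordinates)."""
    return math.dist(top, bottom)

# Hypothetical shoulder-to-elbow key points, in meters:
upper_arm = bone_length((0.2, 1.4, 0.0), (0.2, 1.1, 0.0))
print(round(upper_arm, 3))  # 0.3
```

Tracking works because this length stays roughly constant for a given person, so the software can use it to keep each bone's key points consistent from frame to frame.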


I have finished working on my CV and am ready for my interview. I have no clue what I’m facing, but I’m ready to take on any challenge. https://docs.google.com/document/d/1Tuli-srkfa5ujZevzaC8_wbE5P3cAVktOJ28HXf80XE/edit

Marley and I are in the process of finding an unoccupied classroom to conduct our scanning in. The theatre is almost always in use, which makes our lives a little more difficult.


I’m working on my CV for an upcoming interview on the 3rd of December (https://docs.google.com/document/d/1Tuli-srkfa5ujZevzaC8_wbE5P3cAVktOJ28HXf80XE/edit)


Today is the big day of the poster board presentation!!


We are just finishing up our poster board for our AR/VR group.


We have received our first semi-successful 3D-printed design of the shelf. However, it wasn’t designed to fit on a wall properly; it would be impossible to get a screw in. We still like the way the sensor shelf looks, so we’re just going to extend the support beams further back and make some room for the HTC Vive sensor.



While I was waiting for Joe to approve a sensor design, I made a 3D render of what we’re going to make our workspace look like. You can see the two sensors above the whiteboard.


Joe has given us a Microsoft HoloLens set, which might help us in our project. We also received a ZED sensor; however, we are missing a big component called a Jetson board.


We have tried to 3D print the model shelves that Wills made for us, but they use way too much filament and are hard on the printer. We have found a 3D scanning software called CANVAS. It scans rooms in full detail and imports them into VR. We are planning to use this for the theatre group.


I’ve done some research on how AR/VR is being used today, including in education. We’ve received the 3D model of the shelves from Wills; however, it’s not to scale, and he didn’t incorporate the dimensions we asked for. We are planning as a group to learn how to use the ZED 360 camera.


I have mastered Polycam and tested it out on Marley’s backpack. Instead of taking only the bare minimum number of photos, I took 35 and set the detail setting to full. I have still not received an email from Wills. The backpack came out much better than I expected, but it’s still a little messy. I have learned that you need at least 50+ images to get a good model. In the future, we are planning to import the scanned objects into FrameVR to help achieve our goal of creating a virtual theatre. This would be very helpful to the theatre group because they struggle with space. The plan is to scan objects from the real world into the virtual world so the theatre group can test which pieces of furniture go where without having to buy the actual furniture yet.


I have learned how to use a software called Polycam that lets you take 20+ photos of an object and turns them into a 3D model. I started off by scanning a chair and only took 20 photos of it. It sort of worked out. I now know to take 30+ photos.


I’m planning on emailing a guy named Dawson who can help me get a 3D rendering of the shelves. I’ve just been told to email Wills about the 3D model.


All we need to do is find someone who knows CAD to make us a 3D model of our shelf idea. I’ve started to learn a 3D modeling software called Onshape. However, it is extremely difficult to understand, so I’m watching the tutorials in the meantime.


We have found a location for the HTC sensors: they’re going to go on shelves that we’ll make. Marley has started to learn how to use FrameVR, we got a drawing of what we want our shelf layout to be, and we’ve settled on the dimensions we want for the shelves.


We’ve come back to Engage and got it working again. We loaded up a classroom and learned how to import items and objects into the room. We’ve located all of the necessary wires to set up the Vive; however, we’re struggling to find a spot for the sensors.


We successfully set up the Oculus headset. We didn’t have much room for the sensors, so we set them up on top of the PCs on the desk. We were told to learn the Engage software, but for some reason it isn’t working.


We’ve attempted to set up the HTC Vive but have encountered a major problem: the wire connecting the sensors to the PC is far too short. So we’re setting up the Oculus instead.

21/08/23 – This is my first day at WISRD.