Sam has gone to deal with the broken computer on his own. I’ve started learning Blender, which will help us patch holes in our 3D scans. Learning it is extremely difficult.


One of our PCs didn’t have enough disk space, so we bought a 2 TB hard drive. When we took the computer apart to install it, we realized how dirty the entire thing was, so we completely disassembled it and cleaned everything out.


We had our Zoom meeting with Will, and he suggested many helpful things. I’m going to learn Blender and have started on the fundamentals.


We have found a time when we can start scanning the theatre room. However, we’ve hit a bit of a setback: we are trying a completely new recording engine called ZEDfu. So far ZEDfu is much better in terms of quality and is the most straightforward. We are also posting a new tweet, and we have learned how to import certain objects into AR (HoloLens).


Marley and I went to scan the Gallery and made some huge progress. However, without the Jetson Nano board it’s hard to scan everything. We also haven’t figured out how to save a recording in Unity.


Today I was supposed to have my interview, but Joe couldn’t meet with me.


Marley and I scanned the Gallery and made huge progress. However, every time we scan it, the scan deletes itself.


Skeletal tracking (ZED) 

The new software for the ZED camera allows one to track the skeletal system of a human. It works by tracking a person’s bones via key points: each bone is defined by two key points (its bottom and top), which give the bone’s initial length.
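The idea above can be sketched in a few lines of code. This is not the actual ZED SDK API — the coordinates and names below are hypothetical stand-ins for the key points the tracker would output — it just shows how a bone’s length comes from its two key points:

```python
import math

# Hypothetical key points for one bone (a forearm), as 3D coordinates
# in metres. Real values would come from the ZED skeletal tracker.
elbow = (0.20, 1.10, 0.50)  # bottom key point of the bone
wrist = (0.45, 0.95, 0.55)  # top key point of the bone

def bone_length(p, q):
    """Euclidean distance between a bone's two key points."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

print(f"Bone length: {bone_length(elbow, wrist):.3f} m")
```

Because the length between a bone’s two key points stays roughly constant as a person moves, the tracker can use it to keep each bone identified from frame to frame.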


I have finished working on my CV (https://docs.google.com/document/d/1Tuli-srkfa5ujZevzaC8_wbE5P3cAVktOJ28HXf80XE/edit) and am ready for my interview. I have no clue what I’m facing, but I’m ready to take on any challenge. Marley and I are in the process of finding an unoccupied classroom to conduct our scanning in. The theatre is almost always in use, which makes our lives a little more difficult.


I’m working on my CV for an upcoming interview on the 3rd of December (https://docs.google.com/document/d/1Tuli-srkfa5ujZevzaC8_wbE5P3cAVktOJ28HXf80XE/edit)


Today is the big day of the poster board presentation!!


We are just finishing up our poster board for our AR/VR group.


We have received our first semi-successful 3D-printed design of the shelf. However, it wasn’t designed properly to fit on a wall; it would be impossible to get a screw in. We still like the way the sensor shelf looks, so we’re going to extend the support beams further back and make room for the HTC Vive sensor.



While I was waiting for Joe to approve a sensor design, I made a 3D render of what we want our workspace to look like. You can see the two sensors above the whiteboard.


Joe has given us a Microsoft HoloLens set, which might help us in our project. We have also received a ZED sensor; however, we are missing a big component called a Jetson board.


We tried to 3D print the model shelves that Wills made for us, but they use way too much filament and are hard on the printer. We have found a 3D scanning software called CANVAS. It scans rooms in full detail and lets us import them into VR. We are planning to use this for the theatre group.


I’ve done some research on how AR/VR is being used today and how it’s being used in education. We’ve received the 3D model of the shelves from Wills; however, it’s not to scale, and he didn’t incorporate the dimensions we asked for. As a group, we are planning to learn how to use a ZED 360 camera.


I have mastered how to use Polycam and tested it out on Marley’s backpack. Instead of taking only the bare minimum number of photos, I took 35 and set the detail setting to full. I have still not received an email from Wills. The backpack came out much better than I expected, but it’s still a little messy. I have learned that you need at least 50+ images to get a good model. In the future, we are planning to import the scanned objects into FrameVR to help achieve our goal of creating a virtual theatre. This would be very helpful to the theatre group, because they struggle with space. The plan is to scan objects from the real world into the virtual world so the theatre group can test which pieces of furniture go where without having to buy the actual furniture yet.


I have learned how to use a software called Polycam that lets you take 20+ photos of an object and turn them into a 3D model. I started off by scanning a chair and only took 20 photos of it. It sort of worked out. I now know to take 30+ photos.


I’m planning on emailing a guy named Dawson who can help me get a 3D rendering of the shelves. I’ve just been told to email Wills about the 3D model.


All we need to do is find someone who knows how to use CAD to get us a 3D model of our shelf idea. I’ve started to learn a 3D modeling software called Onshape. However, it is extremely difficult to understand, so I’m watching the tutorials in the meantime.


We have found a location for the HTC sensors, which are going to sit on shelves that we’ll make. Marley has started to learn FrameVR, and we have a drawing of our planned shelf layout along with the dimensions we want for the shelves.


We’ve come back to Engage and got it working again. We loaded up a classroom and learned how to import items and objects into the room. We’ve located all of the necessary wires to set up the Vive; however, we’re struggling to find a spot for the sensors.


We have successfully set up the Oculus headset. We didn’t have much room for the sensors, so we placed them on top of the PCs on the desk. We were told to learn the Engage software, but for some reason it isn’t working.


We attempted to set up the HTC Vive but encountered a major problem: the wire connecting the sensors to the PC is far too short. So we’re setting up the Oculus instead.

21/08/23 – This is my first day at WISRD.