Week 18/19/20/21

Update time! This update may seem brief, but that's simply because I've been tied up with some huge projects over the past four weeks, some of which I can't disclose just yet. As soon as they go ahead I'll post an update.

Firstly, starting off with a little side project I began developing for fun. This project is pre-UE4.25, before Epic shipped a product configurator template with Unreal Engine along with the Variant Manager. I wanted to get to grips with more ray-traced scenes without the need for baking lighting, so I used some arch-vis assets from the Unreal Marketplace along with some clean materials from Quixel to construct an 'Ikea-esque' showroom of furniture.

The basic functionality of the demo is to click on an object in the scene and configure the items to your preference; the pretty part is the data handling on the back end, which pulls a .json file from a WordPress API through an HTTP request and loads all of the information into a struct. This means the demo can be embedded on a web page through pixel streaming and then dynamically pull information contextually from the given page, essentially tailoring the experience based on the site. If you had a page specifically for minimalist furnishings it would only load plain and simplistic assets, whereas a page based on maximalism might swap the materials for more patterned, colourful ones. It's all about context-based data automation, which could easily speed up development for assets like this.
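For anyone curious about the back-end flow, here's a minimal C++ sketch of how a request like this can be made in UE4 and deserialised into a struct. The endpoint URL, struct fields and function name are placeholders for illustration rather than the project's actual setup.

```cpp
// Minimal sketch: fetch a JSON config from a (placeholder) WordPress REST
// endpoint and deserialise it into a struct array. Needs the "Http", "Json"
// and "JsonUtilities" modules in Build.cs, and the USTRUCT would live in a
// header with the usual .generated.h include.

#include "HttpModule.h"
#include "Interfaces/IHttpRequest.h"
#include "Interfaces/IHttpResponse.h"
#include "JsonObjectConverter.h"

USTRUCT()
struct FShowroomItemConfig            // simplified placeholder struct
{
    GENERATED_BODY()

    UPROPERTY() FString ItemName;
    UPROPERTY() FString MaterialStyle; // e.g. "minimalist" vs "maximalist"
};

void RequestShowroomConfig(const FString& PageContext)
{
    // Placeholder URL -- the page context lets the backend tailor what it returns.
    const FString Url = FString::Printf(
        TEXT("https://example.com/wp-json/showroom/v1/items?context=%s"), *PageContext);

    auto Request = FHttpModule::Get().CreateRequest();
    Request->SetURL(Url);
    Request->SetVerb(TEXT("GET"));
    Request->OnProcessRequestComplete().BindLambda(
        [](FHttpRequestPtr, FHttpResponsePtr Response, bool bSucceeded)
        {
            if (!bSucceeded || !Response.IsValid())
            {
                return;
            }

            // Deserialise the JSON array straight into the struct array.
            TArray<FShowroomItemConfig> Items;
            FJsonObjectConverter::JsonArrayStringToUStruct(
                Response->GetContentAsString(), &Items, 0, 0);

            // ...hand Items off to the configurator UI / variant logic here.
        });
    Request->ProcessRequest();
}
```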





Next up, some BIG stuff. That joke might make more sense in the future. Anyway, we're on course at the moment to start playing around with a 'big data' project. I can't really go too much into the 'why' or the expected outcomes of the project, but I can disclose the early developments of the idea. Essentially, it uses existing BIM files of buildings to auto-generate levels which AI can navigate based on a timetabling system. The idea is to pull in dynamic data from the internet, through a CSV or a JSON file, to point the AI to their next destination. Because the AI work off a global nav mesh, we can dynamically update that mesh in real time to change their flow of movement, along with creating rules such as keeping distance from other AI. This can all be used to measure some interesting statistics, such as how long an AI has been stood idle, or whether unwanted collisions occur between two simulated bodies.
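As a flavour of what one step of that loop might look like in code, here's a rough UE4 C++ sketch: take a timetable entry (in practice parsed from the CSV/JSON feed), send the AI to its next destination over the nav mesh, and keep a timestamp so idle time can be measured. The struct fields, acceptance radius and function names are placeholders rather than the real project's data.

```cpp
// Sketch of one step of the timetabling loop: move an agent to its next
// timetabled destination and track when it last received a task, so idle
// time can be reported later. All names and values here are placeholders.

#include "AIController.h"
#include "Misc/DateTime.h"

struct FTimetableEntry            // placeholder -- parsed from the CSV/JSON feed
{
    FString   AgentId;
    FVector   Destination;
    FDateTime StartTime;
};

void SendToNextDestination(AAIController* Controller, const FTimetableEntry& Entry,
                           FDateTime& OutLastTaskTime)
{
    if (!Controller)
    {
        return;
    }

    // MoveToLocation path-finds across the (dynamically rebuilt) nav mesh;
    // the acceptance radius doubles as a crude spacing buffer around targets.
    const float AcceptanceRadius = 150.f;   // placeholder value, in cm
    Controller->MoveToLocation(Entry.Destination, AcceptanceRadius);

    OutLastTaskTime = FDateTime::UtcNow();
}

// How long (in seconds) an agent has gone without a task -- the basis of the
// "how long has this AI been stood idle?" statistic.
double SecondsIdle(const FDateTime& LastTaskTime)
{
    return (FDateTime::UtcNow() - LastTaskTime).GetTotalSeconds();
}
```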






Finally, my next topic isn't something I can really 'show'. It's more me personally exploring Unreal's Datasmith integration along with its Dataprep assets. Dataprep is the sister to Datasmith: it allows you to temporarily import large CAD files and create rules and Blueprints to automate mesh optimisation. This includes deleting meshes that are rendered but contain no material data, or removing meshes with specific tags. For example, the BIM model above had around 70,000 assets when it was originally imported, but removing assets labelled with the 'ifc.screwsandbolts' tag stripped out roughly 30,000 unseen assets, as they're simply not needed for this level of visualisation. This is something I will explore further, as I have plans to develop a project with multiple BIM assets and will need as much optimisation as possible.
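Dataprep itself is driven through its visual recipe graph, but to show the idea, here's the tag-based culling written out by hand as a plain UE4 C++ sketch: walk the actors in a world and destroy anything carrying a given tag. The function name and iteration approach are illustrative rather than the actual Dataprep setup.

```cpp
// Sketch of the tag-based culling idea: iterate every actor in a world and
// destroy the ones carrying a given tag (e.g. "ifc.screwsandbolts"). In the
// real workflow this filtering happens inside a Dataprep recipe; this is just
// the equivalent logic written out by hand.

#include "EngineUtils.h"           // TActorIterator
#include "GameFramework/Actor.h"

int32 RemoveActorsWithTag(UWorld* World, FName TagToRemove)
{
    int32 RemovedCount = 0;
    if (!World)
    {
        return RemovedCount;
    }

    for (TActorIterator<AActor> It(World); It; ++It)
    {
        AActor* Actor = *It;
        if (Actor && Actor->ActorHasTag(TagToRemove))
        {
            Actor->Destroy();
            ++RemovedCount;
        }
    }
    return RemovedCount;
}

// e.g. const int32 Removed = RemoveActorsWithTag(GetWorld(), TEXT("ifc.screwsandbolts"));
```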

I guess this is a conclusion for my first semester? Where did I start? I wanted to make a little prototype game for the NHS and build some cool scenes in VR. How far have I come? Well, that 'little game' has been nominated for 2 national awards in the UK, and rather than just developing a single room, it's expanded into developing 6 working factory digital twins, with one of those practically being used to plan a major factory facility in the Lancashire area, along with 4 other major project bids being placed within the last month. Maybe I'm doing something right?
