It is often described as the colour of the sky on a clear day.
The irony is that it’s also the name of Microsoft’s cloud computing platform. As much as I love the sun beaming down on me and announcing the arrival of spring after a long, cold winter, there’s just something about a cloudy day that I appreciate. Is it the toned-down, diffused light (which I think makes for great architectural visuals), or the moody atmosphere? Or the anticipation of rain? Or the fact that it’s the only thing reminding you that you’re 35,000ft up in the air when cruising in a Boeing 777? Or is it rather the idea of a storm approaching with a silver lining at the end? What if all the clouds were cloud nine? In any case, it’s the idea of clouds being interpreted in different ways that I find interesting, and you may be wondering what this has got to do with Microsoft Azure, other than it being based on the “cloud”.
Layers. It’s all about layers.
Let me explain. Right after I got sponsorship, in the form of Azure credits, for my start-up (thedimension.xyz), I went about exploring all the services and tools I could use to elevate the business. What I knew was that Azure is one of the biggest cloud hosting providers in the world, competing with the likes of Amazon’s AWS and Google Cloud Platform, and that you can run dynamic web apps on it. What I didn’t expect was just how many components there were. Now, I didn’t study network architecture at university, but I did manage to build and run a mini NAS server at home, so I felt I knew the basics. Enterprise networks, though, are on a different level, and I don’t just mean in terms of complexity (although that is also the case) but the sheer number of apps, with billions of lines of code between them, working (in most cases) in harmony with billion-dollar hardware setups, all powering the pixels, letters and sounds billions of us rely on for modern life. If there was ever a metaphor for “making it rain”, then cloud services are not just torrential but also both absorbers and replenishers of the ocean.
It’s easy to want to add every single feature if cost isn’t a worry, but when you have a set budget it’s about maximising the investment and catering to the specific needs of your business, which in my case meant being really picky about what my use cases for being on the cloud would be. I needed to be able to do the following:
- Custom domain with email
- Version control for app dev
- Planning / tracking / organising projects
- AI services integration
- Virtual Machines for hosting/testing in Unreal Engine
- Pixel streaming in UE
This is where time spent researching the documentation and finding out what other people have been able to do pays off. The term “tech stack” kept cropping up, which is basically the set of technologies you select to build with and utilise for a certain job. A lot of technologies do very similar things, such as programming languages, but it’s only when you understand the ins and outs of a variety of them that you can make an informed decision. Or just pick whichever is most popular, has loads of support, detailed documentation, a well-designed website and a modern UI. A trial or free tier doesn’t hurt either 🙂
Cirrocumulus
Starting at the broader picture, or at the highest level, the operating system is the gateway to and enabler of more specialised apps, and I’ve had the pleasure of growing up with Windows since I was tall enough to look out of one. I’ll try not to sound like a Microsoft fanboy, but if Windows wasn’t world-changing enough, Office certainly changed the game while I was at school, with the likes of Word and Publisher impressing the teachers with professional-looking reports. I was never sure of the contents, but AI didn’t exist at that point. Fast forward a few years and here I am, getting my admin email account linked to my domain with Outlook. The business edition of Office 365 was one of the first benefits I redeemed through the Start Up program. I knew I would need it for my accounting tracker (which I would build in Excel) and for its integration with the Power suite of apps, which I know will come in handy for automating certain processes, such as scanning an invoice and extracting all of its data into a table.
With my email sorted, I went about finding the best solution for version control that Azure supported and it initially seemed like I had two choices:
1. A full-stack Azure DevOps toolchain, using a number of native Azure apps that would certainly work well together, as shown in this diagram I found in the documentation:

2. The GitHub route:

After hours of debating between the two, I found that, while GitHub (also owned by Microsoft) is the most widely used developer platform, it has some limits when it comes to file sizes, even with an enterprise account. I believe a single file cannot be more than 5GB even with the Git LFS extension enabled, which in most cases wouldn’t be a problem, but as I anticipate builds from Unreal being many times larger than this, I ruled it out. However, I did come across a solution called Anchorpoint that is aimed at game-dev artists; it too is Git-based but manages to overcome the file size limit, which makes it compatible with the generous 100TB storage limit on Azure file shares. This sounded perfect for my needs until I found out that the free tier only supports hosting on GitHub. So I had to rule it out too.
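To make the file-size concern concrete, here’s a rough shell check for whether anything in a build folder would blow past a 5GB per-file ceiling. The 5GB figure is my reading of GitHub’s Git LFS limits, and the `Build/` path and `.pak` file are just placeholders standing in for a packaged Unreal build:

```shell
# Sketch: count files in a build folder larger than ~5GB, the per-file
# ceiling I believe applies to Git LFS on GitHub. The Build/ directory
# and demo file below are placeholders, not a real Unreal build.
LIMIT_BYTES=$((5 * 1024 * 1024 * 1024))   # 5GB in bytes

# Stand-in build output (tiny here, purely for the demo).
mkdir -p Build
dd if=/dev/zero of=Build/demo.pak bs=1024 count=10 2>/dev/null

# Anything over the ceiling means LFS-on-GitHub won't cut it.
oversize=$(find Build -type f -size +"${LIMIT_BYTES}"c | wc -l | tr -d ' ')
echo "files over ${LIMIT_BYTES} bytes: ${oversize}"
```

With real multi-gigabyte `.pak` builds this would report a non-zero count, which is exactly the situation that pushed me away from GitHub for the big projects.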
This diagram outlines my plan and how I wish to use data in the cloud:
- P4 (Helix Core) for version control of Unreal projects
- GitHub for smaller projects / tools and experimenting
- Azure Boards for documenting and planning

Perforce P4 (Helix Core) has been the industry standard for game development version control for many years now, and once I found out that they had created a quick-start template for Azure, I was sold. I followed this guide, but I had a lot of issues with the deployment at first. The Swarm server (the code review app) didn’t seem to want to install, so I ended up deleting the whole resource, which led to me tearing my hair out when trying to delete a protected instance of a backup folder. The issue was that I thought the drive was empty, but I hadn’t realised one of the filters had hidden the file from view. The next problem to solve was connecting to the Linux VM; as this didn’t support remote desktop (with a visual screen share), I had to use the terminal to edit a file on the machine, and for some reason my SSH key was not working. A restart, as always, did the trick and I was then able to log in, but then came the issue of the P4 typemap file saying I had input the wrong syntax, even though I had literally copied and pasted from their own documentation. I was about to give up at this point but luckily tried another typemap based on Unreal’s recommendation, and it worked! It was just the spacing of the code that was the issue. After that hiccup, it was smooth sailing and I got it synced with my local machine with some test projects. Though the UI isn’t as user-friendly as Anchorpoint’s, the feature set, such as the branching (stream) options seen here, got me excited to start developing.
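For anyone hitting the same wall: the “wrong syntax” complaint came down to indentation, since every entry under `TypeMap:` has to be indented (a tab works) when you edit the spec via `p4 typemap`. A trimmed sketch along the lines of Unreal’s recommended typemap looked something like this; the depot path and the exact set of entries are illustrative, not the full list Epic suggests:

```
# Trimmed Perforce typemap sketch for Unreal projects.
# Each entry must be indented beneath TypeMap: (a tab works).
TypeMap:
	text //depot/....ini
	text //depot/....cpp
	text //depot/....h
	binary //depot/....bmp
	binary+w //depot/....exe
	binary+w //depot/....dll
	binary+l //depot/....uasset
	binary+l //depot/....umap
```

The `+l` modifier gives exclusive checkout on the binary asset files, which is what stops two people silently clobbering the same `.uasset`.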
Part II: Altocumulus will be next up, where I will focus primarily on the coding side of things, with my views on Visual Studio (as my code editor of choice), GitHub Copilot (now an invaluable resource to help me code) and AI integration (using it to recognise facades and crop them into textures).
Stay tuned.