The problem

Data storage is not as expensive as it used to be, but storing large amounts of data in a secure and reliable way still carries a substantial price tag. For cloud services that store petabytes of data, this remains a costly operation.

3D models use far more data storage than an average webpage. For every model, the geometry, appearance, scene and animation information must be stored. Depending on the file format, this ranges from a few hundred kilobytes in a compressed format for a simple model to tens of megabytes for more complex structures and more detailed models. It is advisable to optimize models for web distribution so the amount of data is minimized. This not only saves storage capacity; the data also has to travel through the network to the client that interacts with the models in the browser.

Storing the data for a single Cosmos planet or VR application containing multiple models is still manageable and can be done on a single server. As with running a game on your own computer, average hardware has no problem storing the models needed, even for larger applications. With hard disks of several terabytes, the data for multiple games or planets can be stored easily. But a single server hosting the data creates a single point of failure: when this server goes offline, the application is offline as well.

If you run a website on a dedicated server, a virtual private server or a cloud server, it has at least tens of gigabytes of storage available, even when using fast SSDs. This is often far more than needed, and most of the available storage remains unused. A 3D web application uses more storage, but still less than the average hosting provides. By utilizing the available storage as a cache for other applications, the whole system benefits from these resources. It makes the data available in more locations, and by storing it multiple times it serves as a backup system as well.

The purpose of a private server or node

The data for the online applications must be stored somewhere on the internet. This is one of the tasks of a private server. It stores the data much like a web server for a website would. Besides storing the data of the owner's application, it also uses the available storage to cache data from other applications on the platform. This utilizes the available storage in the network better than web servers currently do. A lot of storage space sits unused because web servers are often dedicated to a single application or website, and the storage allocated to that host cannot be used by anything else.

Servers or nodes distribute the data to each other as seeders using the BitTorrent protocol. This duplicates the data on multiple nodes, making it redundant and available even when the original host is offline. It also makes the system faster, because more nodes can serve the data to clients. Since not all data can be stored on a single server, each node only stores data that is used by clients nearby, like a caching system.
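The caching behaviour described here, keep what nearby clients request and evict what falls out of use within a storage budget, can be sketched as a simple least-recently-used cache. The class and field names below are illustrative assumptions, not the platform's actual implementation:

```python
from collections import OrderedDict

class NodeCache:
    """Sketch of a node's local cache: keeps the most recently
    requested pieces and evicts the least recently used ones when
    the storage budget is exceeded. Names are hypothetical."""

    def __init__(self, capacity_bytes):
        self.capacity = capacity_bytes
        self.entries = OrderedDict()  # piece id -> size in bytes
        self.used = 0

    def request(self, piece_id, size):
        # A request marks the piece as recently used (or stores it).
        if piece_id in self.entries:
            self.entries.move_to_end(piece_id)
            return
        self.entries[piece_id] = size
        self.used += size
        # Evict least recently used pieces until within budget.
        while self.used > self.capacity:
            _, evicted_size = self.entries.popitem(last=False)
            self.used -= evicted_size

cache = NodeCache(capacity_bytes=100)
cache.request("model-a", 60)
cache.request("model-b", 30)
cache.request("model-a", 60)   # touch: model-a is now most recent
cache.request("model-c", 40)   # over budget: evicts model-b
```

A real node would of course weigh in popularity and proximity of the requesting clients, but the principle is the same: frequently requested data stays cached, cold data makes room.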

When a client requests the data for a particular model, it uses a magnet link to start downloading it. The server that receives the request immediately looks for the data in the torrent network and starts downloading it as well. All nodes that have the data provide it, and by being requested, the data is duplicated on a new node.
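A magnet link identifies the data by its info hash rather than by a server location, which is what allows any node to serve it. A minimal sketch of extracting that identifier, covering only the common `xt`, `dn` and `tr` fields of the magnet URI scheme (the example hash and tracker below are made up):

```python
from urllib.parse import urlparse, parse_qs

def parse_magnet(link):
    """Extract the info hash, display name and tracker URLs
    from a magnet link."""
    query = parse_qs(urlparse(link).query)
    xt = query["xt"][0]  # e.g. "urn:btih:<info hash>"
    return {
        "info_hash": xt.split(":")[-1],
        "name": query.get("dn", [""])[0],
        "trackers": query.get("tr", []),
    }

magnet = ("magnet:?xt=urn:btih:c12fe1c06bba254a9dc9f519b335aa7c1367a88a"
          "&dn=model.glb&tr=udp%3A%2F%2Ftracker.example%3A6969")
info = parse_magnet(magnet)
# info["info_hash"] identifies the data regardless of which node serves it
```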

Clients can obtain the data for landparts, surrounding land or a planet from any server, but will receive it from the fastest one available nearby. When there is more demand for a particular piece of data, because more users are in the same location or on the same planet, more nodes will have it in cache. As a result, it becomes more widely distributed and reaches clients faster.
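Selecting the fastest available node can be as simple as probing each candidate's round-trip time and taking the minimum. The node names and latency figures below are hypothetical; a real client would measure latencies with live probes:

```python
def pick_fastest(nodes, probe):
    """Choose the node with the lowest measured round-trip time.
    `probe` maps a node address to its latency in milliseconds."""
    return min(nodes, key=probe)

# Hypothetical measured latencies to three candidate nodes.
latencies = {"node-eu": 18.0, "node-us": 95.0, "node-asia": 210.0}
best = pick_fastest(list(latencies), probe=lambda n: latencies[n])
# best is the lowest-latency node, here "node-eu"
```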

Another advantage of this torrent mesh network is that the failure of a single node does not affect the system. Since all data is available on multiple servers, it can be delivered from other nodes, and because all servers can perform the same tasks, it does not matter which one actually performs them.

Besides storing and distributing data, the server nodes will also perform other tasks (discussed in another post on the Cosmos platform architecture). This makes the whole platform distributed and redundant. As more people participate in the network, it performs better and becomes more reliable.

The advantage of hosting your own data

Having a private server node online ensures that the data of the owner's landpart and planet is available in the network, and it also contributes to the performance of the platform. The private server primarily hosts the data of the wallet owner running the node, but it also caches data used by clients nearby.

Hosting your own data is not mandatory

So, running a server node to participate in the system offers some advantages and forms the backbone of the whole platform. However, you do not need your own server to deploy your application. You can create and deploy your creations to the network from your own computer, and the data will be cached in the network as well. There is, however, no guarantee that the data stays in cache when no users are requesting it.

Accessing the data

To access the data for models in the platform from the browser, you need to know which pieces of data you need and where to find them. This information is stored in a blockchain. Your location in the virtual world determines the landpart you are on. Via this landpart token, all tokens of the models on it can be retrieved. The model tokens store the data locations as magnet links, and with these magnet links the data can be downloaded from the torrent network.
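The lookup chain just described, location to landpart token, landpart token to model tokens, model tokens to magnet links, can be sketched with toy in-memory records standing in for the on-chain data. All names, coordinates and token formats here are assumptions for illustration, not the real token layout:

```python
# Toy records standing in for on-chain data (illustrative only).
LANDPARTS = {
    (12, 7): "landpart-token-0x01",  # grid coordinate -> landpart token
}
MODELS_BY_LANDPART = {
    "landpart-token-0x01": ["model-token-0xa1", "model-token-0xa2"],
}
MAGNET_BY_MODEL = {
    "model-token-0xa1": "magnet:?xt=urn:btih:aaa...",
    "model-token-0xa2": "magnet:?xt=urn:btih:bbb...",
}

def magnet_links_for_location(coord):
    """Walk the chain: location -> landpart token -> model tokens
    -> magnet links, as described above."""
    landpart = LANDPARTS[coord]
    model_tokens = MODELS_BY_LANDPART[landpart]
    return [MAGNET_BY_MODEL[m] for m in model_tokens]

links = magnet_links_for_location((12, 7))
# one magnet link per model token on the landpart
```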


The storage requirements of a 3D virtual world are higher than those of traditional websites, and storing large amounts of data in a secure way is still expensive. The solution is to distribute the storage in smaller amounts over multiple nodes.

To make the storage of the data distributed, redundant and widely available, we rely on private servers (nodes). These servers act as seeders in a torrent network, storing the data of the owner but also caching data needed by clients nearby. By caching the data, it is duplicated and thus serves as a backup system as well. It also becomes available from multiple nodes, which improves the performance of the platform.

When a client wants to access data, nearby servers download the data as well and thereby duplicate it. By duplicating the data, it becomes available on more nodes, which immediately speeds up the download process.

Casper Karreman

Casper is a 42-year-old software engineer from the Netherlands who has worked as a software developer for over 18 years and has been coding for almost 30. Starting out as a hobby at the age of 12, he grew into a full-stack, multidisciplinary DevOps professional, recently adding machine learning and blockchain development to his portfolio. When he's not behind a computer, he is probably rock climbing, mountain biking or snowboarding.