If your server has enough RAM to store and query a graph with 5 billion entities, you should not have an issue running the bulk loader. It automatically divides your input into batches to populate a buffer of up to 2 gigabytes, and maintains a dictionary mapping every node to its identifier.
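As a rough illustration (not the bulk loader's actual code), the batching strategy described above can be sketched like this: accumulate serialized entities into a buffer up to a size cap, flush each full buffer as one batch, and record every node in a dictionary that assigns it a numeric identifier. The function name, the per-entity size measure, and the string representation of entities are all simplifying assumptions here.

```python
MAX_BUFFER_BYTES = 2 * 1024 ** 3  # 2 GB cap, as described above

def batch_entities(entities, max_buffer=MAX_BUFFER_BYTES):
    """Split entities into batches whose serialized size stays under max_buffer,
    while assigning each node a sequential identifier (hypothetical sketch)."""
    node_ids = {}                       # node -> identifier dictionary
    batches, batch, batch_size = [], [], 0
    for entity in entities:
        node_ids.setdefault(entity, len(node_ids))  # next free id for new nodes
        size = len(entity.encode("utf-8"))          # simplified size estimate
        if batch and batch_size + size > max_buffer:
            batches.append(batch)                   # flush the full buffer
            batch, batch_size = [], 0
        batch.append(entity)
        batch_size += size
    if batch:
        batches.append(batch)
    return batches, node_ids
```

The dictionary is what dominates memory at the 5-billion-entity scale, since it must hold an entry for every node for the duration of the load.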
I'd expect this to take dozens of hours, but there are too many factors in play to be precise. Generally, load time scales linearly with input size. Building a graph with about 5 million nodes, 5 million edges, and 20 million properties takes 220 seconds on my system, so scaling that up by a factor of 500 gives roughly 30 hours as a very rough estimate.
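The back-of-the-envelope arithmetic behind that estimate, assuming purely linear scaling (which, as noted, is only a rough approximation):

```python
# ~10M entities (5M nodes + 5M edges) loaded in 220 seconds on the test system;
# a 500x larger input is assumed to take 500x as long.
seconds_small = 220
scale_factor = 500
hours_large = seconds_small * scale_factor / 3600
print(f"{hours_large:.1f} hours")  # roughly 30 hours
```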
jeffreylovitz commented on Aug 10, 2020
Hi @uriva,