EMDR Relay 1.1 Released

EVE Market Data Relay (EMDR) has been chugging along behind the scenes in the EVE Online developer community, quietly delivering large volumes of player-supplied market data. But the winds of change are arriving: CCP has released a set of HTTP APIs for obtaining much of the data directly. EMDR will continue to function for those who don’t want to poll on their own.

The first step in putting a fresh coat of paint on EMDR is to freshen up the relays. We have done just that, updating and modernizing a few things.

Some highlights of the 1.1 release:

  • I have built a Docker image that is now the officially endorsed way to set up and run an EMDR relay.
  • We’ve upgraded (and now require) ZeroMQ 4.x. If you use the Docker image, you don’t need to worry about this.
  • We now auto-restart the process every 12 hours. This works around some ZeroMQ edge cases where connections aren’t restored correctly. Most of our relay operators are already doing this, but now we all will, just in case.

See our Docker Hub repo for full setup instructions. Current relay operators are encouraged to upgrade, though nothing will break if you don’t.
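For illustration, running a relay under Docker might look something like the following. The image name here is an assumption (check the Docker Hub repo for the real one); the restart policy is what lets Docker bring the container back up after the process’s built-in 12-hour self-restart exits.

```shell
# Pull the relay image (name is an assumption -- see the Docker Hub repo).
docker pull gtaylor/emdr-relay

# Run detached. "unless-stopped" restarts the container whenever the
# relay process exits, including its deliberate 12-hour self-restart.
docker run -d --name emdr-relay --restart unless-stopped gtaylor/emdr-relay
```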

MUD tech is fun/cool, but…

As software development evolves, there are an ever-expanding number of ways to put together very complex, elaborate systems that are fun to geek out on. Multi-processing is becoming increasingly prevalent, distributed systems are a boon to cases with massive scalability or reliability requirements, and there are all kinds of neat data stores available. These are all definitely things that have a place in software.

But what does this mean, in the context of a MUD?

Honestly, very little. Even a very large MUD with hundreds of connected players can be run on a very pedestrian machine with a modest amount of RAM and very little bandwidth. Multi-threading is not a requirement for performance (and can often work against it). Highly distributed, multi-server setups are hitting nails with jackhammers. After all, we’re talking about a genre that primarily features smaller (<100 connected players) games.

An important thing to keep in mind when developing a MUD is that simple, well-thought-out MUD architectures have a huge advantage over their more complex kin: they are easier to develop, and they are a lot more likely to ever see the light of day.

MUDs are a labor of love

Although you’ll hear people crying about the demise of the text-based genre, there will always be a niche out there for our kind. However, the vast majority of the games in our space are going to remain non-commercial, and mostly developed by volunteers in their spare time.

By making simplicity a high priority, we make sure that it’s easier to make progress when you do have a few moments to sit down and work on your game. While you could write a super-distributed, super-fault-tolerant MUD server, it’s probably going to make future development more complicated, and you may run out of steam before you get anywhere close to being “done”. As someone who would love to see more great MUDs out there, this makes me sad!

Developing and launching a MUD is a labor of love, and it has to be fun and interesting to you. You have a finite amount of time to get your game launched before you probably lose interest and move on to other things. This “time limit” varies from person to person; a rare few can work on a game steadily for years before going public, but most of us can’t. Your goal should be to open to the public before your “ok, time to move on” timer goes off. Simplicity is one of your biggest allies while pursuing this goal.

But… there are always caveats

A very valid counter-point to this argument for simplicity is that MUDs are a great way to learn new technologies, to experiment, to do things one might not normally do. If one’s goal is to tinker more so than to actually release a game to the world, you can throw this all out the window. Get as complex/geeky/sexy as you’d like, and have a blast. Who cares if you never ship? That’s not the point for you, anyway.

For those that are most concerned with actually “shipping”

Focus on your core functionality. What is your “minimum viable product?” What is the most direct way to get to your opening day? Avoid unnecessary complexity, and remember that you can always refactor and improve performance/scalability as you grow. Nothing is set in stone.

Avoid traps like multi-threading (unless you really have to have it), super scalability, and elaborate distributed setups unless you just really want to play. You’re designing a go-kart, not an IndyCar.

Simplicity. Clarity. Focus. Oh, and ship the damned thing!

EVE Unified Uploader (market data) open sourced

The author of the EVE Marketeer Uploader has decided to open source the project, which is great news for the EVE community. The uploader already has a large amount of traction, and this move will guarantee its survival into the future and allow other people to contribute to the project.

The first few things I can think of tackling are:

  • Packaging this up for Mac OS. They’ve got a py2exe’d Windows distro; it should be easy to do the same for Mac.
  • Provide clear instructions for running under Linux. This is probably more a matter of pointing out requirements than packaging.

Kudos to Bart Riepe for opening this up!

More efficient market web APIs for EVE Online

There are a handful of market data sites (EVE-Central, Eve Marketeers, Eve Marketdata) out there now, each with its own developer API. All but EVE-Central are relatively new sites, and most seem to suffer from occasional, or permanent, sluggishness. It doesn’t appear to be for lack of hardware; I know several of these sites are running on some pretty good metal. Rather, the combination of duties each site juggles (accepting incoming data, serving the website, serving the developer APIs) seems to be what slows the sites and their developer-exposed web APIs down. Let’s muse on a cheap, sturdy way to architect around this somewhat.

For the sake of discussion, I’ll just concern myself with the web-exposed developer APIs in this post. These are open-to-the-public services that let other applications grab the market site’s data at will. This is often done with high frequency, and high volume. From what I can tell, some of these sites serve their APIs from their primary app server, which is where the rest of the general web traffic goes through. In some cases, the same process that serves the site and the developer API may also accept incoming market data.

Wouldn’t it be nice if we could shove that API traffic off somewhere else? That would mean sluggishness on the website wouldn’t mean sluggishness in the API, and vice-versa. For market sites with tons of API users, this could free up resources for other things. Let’s see where we can go with this…

The APIs are relatively simple

The good news here is that the APIs on all three sites are relatively simple. To get the price for an item in a certain region, you just pass in the region and the item ID. A “recently updated” query may not require any parameters at all. For the most part, our input is going to be small, and dictated by EVE’s identifiers for various things (items, regions, characters).

Here’s what we know

  • We need a relatively small core set of capabilities for developers to find our service useful. Price by region being the biggest.
  • We don’t want to have to serve the API requests ourselves, since that is boring.
  • We’re probably already doing some processing and aggregation of incoming data.

Here’s what we can do

  • Create a daemon whose only purpose is to accept incoming market data from the uploaders. This gets queued and pumped into another process that does validation, statistical aggregation, and saves such things to a DB.
  • Based on what new data came in, the process uploads JSON documents to Amazon S3, to paths that mimic the current developer APIs (e.g. /region/12345/item/123556/).
  • External developers can then form the URLs much like they already do, hitting S3 instead of your servers.
  • Your API is now infinitely scalable, and pretty much impossible to bring down under load.
  • Access can be left public, or restricted by S3 keys or signed URLs.
  • If you want to be self-sufficient, you could even make developers pay for the bandwidth they use with Amazon DevPay. Given that a few other sites offer free APIs, this may not go over well, though.

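As a rough sketch of the S3-mirroring step, here is what the aggregation process might do each time fresh data arrives for a region+item combo. The key layout and the `upload` callable are assumptions for illustration; in production, `upload` would wrap something like boto3’s `put_object` with `ContentType="application/json"`.

```python
import json

def s3_key(region_id, item_id):
    """Build an S3 key that mimics the existing developer API paths."""
    return "region/%d/item/%d/" % (region_id, item_id)

def build_document(region_id, item_id, orders):
    """Serialize the aggregated market data for one region+item combo."""
    return json.dumps({
        "region_id": region_id,
        "item_id": item_id,
        "orders": orders,
    })

def publish(region_id, item_id, orders, upload):
    """Render one JSON document and hand it to the upload callable.

    `upload(key, body)` is a stand-in for the actual S3 call, e.g.
    s3.put_object(Bucket=..., Key=key, Body=body).
    """
    key = s3_key(region_id, item_id)
    upload(key, build_document(region_id, item_id, orders))
    return key
```

External developers then hit the bucket directly (e.g. https://your-bucket.s3.amazonaws.com/region/12345/item/123556/), never touching your servers.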
Caveats

  • You will need to keep track of how long it’s been since various S3 keys were updated. It may not make sense to always upload the new data as it comes in for super active items. You can afford to wait a minute or two between updates for very active item+region combos.
  • Data transfer into S3 is free, but the bandwidth on your market data upload accepter machine may not be.
  • You will be uploading to S3 pretty constantly.
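The first caveat can be handled with a simple per-key timestamp check. Here is a minimal sketch; the two-minute window and the in-memory dict are assumptions (a busy site might keep this state in Redis or memcached instead):

```python
import time

# Minimum seconds between S3 uploads for any one region+item key (assumed).
MIN_UPLOAD_INTERVAL = 120

# Maps S3 key -> time of the last upload for that key.
_last_upload = {}

def should_upload(key, now=None):
    """Return True if enough time has passed to upload this key again.

    Records the upload time as a side effect when returning True, so
    callers just gate their put_object call on this function.
    """
    if now is None:
        now = time.monotonic()
    last = _last_upload.get(key)
    if last is not None and now - last < MIN_UPLOAD_INTERVAL:
        return False  # Too soon; let updates for this key accumulate.
    _last_upload[key] = now
    return True
```

Super-active item+region combos then settle into one upload every couple of minutes, while quiet ones still go out immediately.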

Reference Implementation?

I briefly debated whether to attempt to write a reference implementation for this, but it looks like the state of market uploaders is pretty bad right now. There’s Contribtastic, but it appears to be very much centered on EVE-Central. There’s also a Unified Uploader, but it’s closed-source and Windows-only.

We’ll re-visit this idea if the uploader scene changes.