100GB Game Downloads Aren't the Problem, Predatory Data Caps Are


#1

The leap to 4K is happening as internet service providers are quietly restricting what customers can download.


This is a companion discussion topic for the original entry at https://waypoint.vice.com/en_us/article/a3z5db/100gb-game-downloads-arent-the-problem-predatory-data-caps-are

#2

Predatory data caps definitely are a problem. I think the ever-higher-resolution train is going to hit an infrastructural wall that forces it to level off at a normal-ish level or be ratcheted down. There is already a base of people who can’t easily access digital games because of poor network infrastructure in their area or region, and data caps are likely to expand that group.

(I am also not a network engineer! If someone with more knowledge wants to chime in to lay out reasons beyond “telecoms make more money”, I’d be happy to hear them.)


#3

I really wish games provided download options. Even with fast internet (and plenty of places are still on ADSL rather than fibre, which is not fast when it comes to 100GB downloads), I don’t need the assets for Doom’s multiplayer and definitely don’t need the added content in the huge MP patches. I’ve got 8GB of VRAM and a 4K screen, but sometimes I’m just taking a glance at a game and I don’t need uncompressed, full-res textures for the entire world. I never need uncompressed audio; give me the compressed stuff and I’m fine. (I care way more about good positional audio on my 5.1 setup than about avoiding the slight degradation of lossy compression by shipping lossless or raw audio [yuk, at least compress that in transit/pak files, friends, even if the game engine wants uncompressed audio for very short clips].) Steam games often do this by offering the top texture tier (or the uncompressed versions that require more VRAM than the DXTC variants) as a free DLC.

But this could certainly be done more often, with developers deliberately optimising for bandwidth requirements (again, if you’ve got 70Mbps+ fibre then that’s cool, me too, but that’s really not universally available). If you’re just automatically generating those mipmaps anyway, you don’t need to send them down the connection; generate them at install time from the highest-detail version, as in the sketch below. Not everything can be built with the compactness of a demo (as in demoscene), but there is certainly a focus that could be brought to AAA development that takes this reality of modern internet use into account.
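To make the install-time idea concrete, here’s a minimal sketch using Pillow. The asset name and the box-filter choice are my assumptions; a real pipeline would work on GPU-compressed formats like the DXTC/BC variants mentioned above, but the principle is the same: ship only the top level, derive the rest locally.

```python
# Sketch: build a mipmap chain at install time from the one shipped
# top-level texture, instead of downloading every level.
# Assumes Pillow 9+ (pip install Pillow); "grass_4k.png" is a made-up asset name.
from PIL import Image

def build_mip_chain(path):
    """Return [level0, level1, ...], each half the size of the previous."""
    levels = [Image.open(path)]
    while min(levels[-1].size) > 1:
        w, h = levels[-1].size
        # BOX resampling averages each 2x2 block, matching the classic
        # precomputed-mipmap average described in the posts below.
        levels.append(levels[-1].resize((max(w // 2, 1), max(h // 2, 1)),
                                        Image.Resampling.BOX))
    return levels

if __name__ == "__main__":
    chain = build_mip_chain("grass_4k.png")
    print([lvl.size for lvl in chain])  # e.g. (4096, 4096) down to (1, 1)
```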


#4

Yeah, I don’t know, the benefits of 4K downsampling seem too thin to justify it as a default for the XOX (am I getting even a 2x color-depth improvement for 4x the data?). The cynical part of me sees the 4K push as another attempt by content companies to thwart Big Internet, like they tried with 3D.


#5

For raw texture sampling, a 4K render and downscale is a wasted op. If you load a more detailed texture, sample the 4 texel locations that end up in the final render, and then downscale to 1080p by merging those 4 pixels, you generate the same average the next mipmap level already pre-computed (each tier in the texture chain is quarter-area/half-size, so it literally just pre-computes the averages of 4 texels [texture pixels]). The toy sketch below shows the equivalence.
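A quick way to convince yourself (a toy NumPy sketch; the 4x4 texture values are made up): averaging 2x2 blocks of the full-res texture, which is effectively what a 4K render plus downscale does for a flat screen-aligned surface, reproduces the precomputed next mip level exactly.

```python
# Toy demonstration: render-at-2x-then-downscale == next precomputed mip level
# for plain texture sampling, i.e. the extra work buys nothing here.
import numpy as np

tex = np.arange(16, dtype=np.float64).reshape(4, 4)  # made-up 4x4 texture

# Next mip level: each texel is the average of a 2x2 block.
mip1 = tex.reshape(2, 2, 2, 2).mean(axis=(1, 3))

# "Render at 2x and downscale": sample all four texels, then merge them.
downscaled = (tex[0::2, 0::2] + tex[0::2, 1::2] +
              tex[1::2, 0::2] + tex[1::2, 1::2]) / 4

assert np.allclose(mip1, downscaled)  # identical averages, wasted work
```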

Where it does have value is at the edges of polygons, where you’re not just getting the same result as a mipmap, because the average spans several different textures. So it’s a slight improvement (though the main visual benefit is super-sampling and avoiding aliasing).

Another benefit concerns shader aliasing. Game engines no longer just look up texels and throw them into the render buffer; they run complex lighting code and look up several textures (shininess, colour, normal, smoothness) to calculate the resulting pixel value. Sometimes this produces discontinuities between neighbouring pixels, creating aliasing inside a single polygon area. That aliasing is improved by supersampling, just as polygon edges are, so it’s outdated to consider mipmaps the complete solution to maximising quality inside a single polygon boundary (unless the shaders have all been constructed to avoid aliasing). How much that benefits from higher-detail textures is an open question (note that some of the textures are often lower res than others, and there it does help to get more detail), so it could be that 4K rendering without “4K textures” would be just as effective at solving the shader aliasing issues. The toy example below shows why mipmapped inputs alone aren’t enough: shading is nonlinear.
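Here’s a toy sketch of that point (the Blinn-Phong-style specular term and the sample values are my own simplification): averaging the inputs first, which is what a mipmap does, is not the same as shading each sample and averaging the results, which is what supersampling does.

```python
# Toy example of why mipmaps alone don't fix shader aliasing: shading is
# nonlinear, so shade(average(inputs)) != average(shade(inputs)).
import numpy as np

def shade(n_dot_h, power=64):
    """A made-up Blinn-Phong-style specular term (nonlinear in its input)."""
    return np.clip(n_dot_h, 0.0, 1.0) ** power

# Four sub-pixel samples of N.H across one screen pixel (made-up values,
# e.g. a bumpy normal map catching a light at one sample).
samples = np.array([0.99, 0.70, 0.65, 0.60])

mipmap_style = shade(samples.mean())   # average inputs, shade once
supersampled = shade(samples).mean()   # shade each sample, then average

print(mipmap_style)   # ~3e-9: the highlight vanishes entirely
print(supersampled)   # ~0.13: part of the highlight survives
```

As the pixel (or the light) moves frame to frame, the first version pops between "highlight" and "nothing", which is exactly the shimmer supersampling smooths out.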

A note: you are already used to massive supersampling to avoid aliasing. Whenever you watch CGI or modern animated movies, you’re enjoying rendering precision that brute-forces around aliasing issues. When you bought a Pixar movie on DVD, you saw the benefit of rendering at a resolution far beyond the SD output the DVD contained. Rendering internally at 4K for a 1080p TV is a proven upgrade; we know because we’ve been experiencing the benefits in offline-rendered CG content for decades.


#6

I don’t have a data cap anymore, but what I do have is 300 kbps download speed. You can imagine my excitement when I bought Doom on sale only to find out it’s nearly 70 GBs compressed. To make matters worse, I have a 250 GB SSD, 30 of which is used by Windows, and Doom will take up 100. It’s like, frustrating y’all.


#7

Last I heard, US internet is terrible all round so download management tools really should be the norm (well, not-terrible net should be as well but games have gotta take consumer reality into account [inc. the weird ass storage space choices they themselves have made]).


#8

Really nice to see this perspective on the somewhat recurrent download-size issue, which I think is too often pushed onto developers, download services, and so on rather than what I see as the real culprit: internet providers abusing their local monopolies to artificially cripple access. For as much as entertainment companies (beyond just games) are trying to force an increasingly digital future to happen, it can only increase disparities in distribution until the fundamental problem of giving people cheap, widely available internet is solved once and for all. Unfortunately I have every reason to suspect the problem is only going to get much, much worse, but coverage like this is a step in the right direction.


#9

Internet access should be considered a public utility, but in a country where Flint still can’t get clean water…


#10

I would still say a 100GB download is a lot for currently affordable hard drives. That said, the worst part is the data cap, since it’s a constant and everything you do on the internet counts against that limit.
I’m lucky to be in an area with no data cap, but hearing how many places are getting them makes me worry.


#11

Yeah like. The data caps are PART of the problem but I think acting like massive downloads would be fine if they didn’t exist is kinda off base.


#12

I don’t exactly disagree, but as somebody who went into a data cap with eyes open (because that’s just how they sell internet up here in Canada, and I don’t love it but I mostly accept it as a fact of life in a weird, large country with low population density), automatic 100GB downloads are still a problem.


#13

I did a lot of grad work in networking, so I can shed some light on why data caps are being introduced. With the introduction of streaming services (like Netflix, Hulu, Crunchyroll), network utilization went a lot higher than most ISPs were expecting. Most of them didn’t want to upgrade their infrastructure; ISPs generally won’t upgrade their stuff unless competition forces them to, and they want to make more money. So on the service side they want to cap how much people use the internet, so that utilization won’t keep climbing, and they don’t want other ISPs routing traffic through their networks. These caps also let them charge people more, because I doubt anyone five years ago thought 1TB of download traffic a month would be a realistic number. Or it was a reality they didn’t want to support. Now that these numbers are becoming reality, it’s just a new way to charge customers. This was one of the reasons Google created Google Fiber: to create competition so other ISPs would upgrade their infrastructure. (The rough arithmetic below shows how quickly a household gets to 1TB these days.)
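For a rough sense of how 1TB stopped being absurd, here’s some back-of-envelope arithmetic. The bitrates are ballpark figures I’m assuming, not numbers from any ISP or streaming service:

```python
# Back-of-envelope: how a household plausibly approaches a 1TB/month cap.
# GB-per-hour figures are rough assumptions (services and bitrates vary).
GB_PER_HOUR_HD = 3   # ~7 Mbps HD stream
GB_PER_HOUR_4K = 7   # ~16 Mbps 4K stream

monthly_gb = (
    2 * 30 * GB_PER_HOUR_4K    # two hours of 4K streaming a day
    + 2 * 30 * GB_PER_HOUR_HD  # two more hours of HD on other screens
    + 2 * 100                  # a couple of 100GB game downloads
)
print(monthly_gb)  # 800 GB, within sight of a 1TB cap before patches,
                   # cloud backups, video calls, and everything else
```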


#14

I’ve yet to reach any kind of data cap with my ISP and I still think 100GB game downloads are a problem. HDD space doesn’t grow on trees (a decent-quality terabyte HDD is what, $70? Things get even more insane if you want an SSD), and even on a 40 meg connection it’s still faster to drive to a store and pick up a physical disc, even counting install times (quick math below).
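The “driving is faster” claim checks out on paper; a quick sketch, generously assuming the 40 megabit line actually sustains its full rated speed the whole time:

```python
# Rough download time for a 100GB game on a 40 megabit line.
size_gb = 100
speed_mbps = 40  # megabits per second, best case

hours = size_gb * 8 * 1000 / speed_mbps / 3600
print(f"{hours:.1f} hours")  # ~5.6 hours, before any throttling or contention
```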

These aren’t the kinds of problems I should have to think about. As the other problems in gaming slowly fade away, where to store all of this crap seems to be not only a constant threat but a GROWING one.


#15

My ISP forces you to buy into a higher data cap if you exceed your cap three times in a twelve-month period (no change to speed). I’ve gone from a 400GB cap (200Mbps) to a 1.1TB cap in the last year and a half, and my internet bill has gone from $60 to $150 because of it. I still average around 900GB a month. It’s not fun.


#16

I pay about that for about 10mbps…


#17

Man, I wish I had these problems haha.

2 days ago I got a message from my ISP saying I was getting up there in my data usage, so I needed to slow down. We’d used 280GB of the 350 we get a month. Our actual download speeds peak at 2Mbps. We pay $100 a month for this service, and it is one of the best options available to us. The rollout of the NBN (a country-wide high-speed fibre network here) is starting to bring higher speeds (a friend can get 40Mbps on it) and caps (“unlimited” internet is essentially unheard of, though some options are starting to filter through), but that won’t be “finished” for another decade.

Straya y’all.


#18

Well, hard drives aren’t cheap. I’m definitely double-checking whether a game is too big even now that I have unlimited fiber. 100GB would be a big no-no.


#19

Also, on the hard drive issue: I’m much more likely to uninstall and throw away a game that takes up too much space than a smaller indie I still want to return to.


#20

When people are giving prices, are these just for the broadband service or bundled with other parts?

Here you can get various consumer broadband via the local copper loop system (ADSL to the exchange, or VDSL to the cabinet on the street and then fibre to the backbone), or via the first fibre network (which was built independently starting 20 years ago and slowly upgraded; it is also only fibre to the curb, but they put in their own final stage rather than using the existing copper loop at all). They all offer bundling with TV services and landline phones that cuts down the individual prices, but typically, even if you only want internet, you pay line rental on the local loop for the first option (which has mandated competition, so all competitors can offer services over that copper/fibre and take control at the exchange).

You’re looking at about $20/month for the line rental, which often comes with free evening and weekend calls, reasonably priced calls at other times, and a landline included. On top of that I’m currently paying $20/month for 80/20Mbps down/up VDSL/fibre with no official cap (but, as I said further up, we often don’t get hard caps; instead the ISP reserves the right to use traffic shaping and soft caps that drop you to, for example, 25% of your normal speed if you use too much during peak hours). I typically use between 600GB and 1.5TB a month, depending how much HD/4K video I’m streaming or how many big games I’m downloading. I’ve never seen my connection get soft-capped with my current provider.