> I do not want a handheld gaming system that requires a goddamn heat sink.

They all have heat sinks. Unless you mean something other than what you actually said.
> I'm not sure if this was answered, but aren't you effectively downsampling to 720p if you use this to transmit from your PC to your TV?

Do you mean the time it takes to render each frame, the screen's maximum refresh rate, or FPS for each game?
Also, do we know frame display rate?
Could be some latency feature or streaming ability that they've built natively into Kepler. The 580s sit on the much older Fermi architecture, which might be missing some key ingredient. NVIDIA certainly would not want to purposely lock out the vast majority of NVIDIA card owners, nor would they want to ask those same people to spend an additional $300-500 to get current on their tech.
And what exactly is a failure for this product?
Selling 10K? 100K? 1M?
Is it a failure if it doesn't destroy Nintendo's handheld market share?
Come on, this is just a little gadget for a niche market.
They don't depend on the success of this Shield thing to keep the company alive; it's just a little experiment, a demonstration of their SoC.
It's only going to be a failure if the user is not happy with the product.
> That is just looking at the latency directly to the handheld, not back to the TV, correct?

No, I include the latency twice: once to get the control inputs there, and then (combined with the transmission time) to get the encoded frame back.
> Hardware h.264 decoding is very cheap power-wise.

Yeah, I guess the battery will last much longer when streaming PC games than when playing Android games.
Because I'm an adult in a house with an office where my most powerful computer is.
> No, I include the latency twice, once to get the control inputs there and then (combined with the transmission time) to get the encoded frame back.

That would just be from device -> computer -> device, correct? I'm probably missing the key ingredient on how this streams to the TV, but wouldn't it have to go device -> computer -> device -> TV set/receiver?
> That would just be from device -> computer -> device, correct? I'm probably missing the key ingredient on how this streams to the TV, but would it not have to go device -> computer -> device -> TV set/receiver?

Sure, and then the TV lag on top of that. I didn't include that use case since it doesn't interest me.
> Are you aware that many console games have 100+ ms input latencies? (And TVs are often around 50, with outliers up to 100+)

Fair enough.
With render times (in ms) on top of all of this, I can't imagine any game that uses input polling would feel okay using this. Maybe NVIDIA will figure out some crazy voodoo.
> Are you aware that many console games have 100+ ms input latencies? (And TVs are often around 50, with outliers up to 100+)

Nope. I play on 120 Hz monitors with 99% of frame times in the 6 ms range.
I like the sound of it, hate the look of it. Will probably bomb like Ouya and the Steam console.
I'm curious: is it possible to play Nintendo 64 games like Majora's Mask on Android yet? If so, making this a portable NES, SNES, N64, GB, GBC, GBA, Master System, Genesis, and PlayStation would be a very good selling point as an emulation device. That alone would sell me on this. Streaming PC games is just an added bonus.
Sure it's an unethical reason to buy this, but come on, who wouldn't use this as the perfect portable emulation machine?
> It's Nvidia, not nVidia.

The confusion probably stems from their old logo, which is nVIDIA, I guess.
> Yes, this will "fail" in the sense that it won't be a mass-market product. However, the nature of it being an open platform (and a streaming device for another open platform) assures that this can't fail in the same way that a closed system can. This will be able to play all mobile games released for Android until it's out of spec, plus all the PC games. The Vita, 3DS and Wii U can't say the same; if those systems fail to attain critical mass, then their software libraries suffer a great deal. This device doesn't have that little problem even if it sells just a few thousand units. For me this looks like a win-win.

Yeah, you can't think of it like a console, since it doesn't have to support its own separate software ecosystem.
> How does this compare to the Wii U?

I guess Wii U games are similar to other console games in terms of internal input latency. The controller streaming is supposed to be very responsive, but I haven't seen any exact measurements so far.
> The confusion probably stems from their old logo, which is nVIDIA, I guess.

AY SOOS!
At least we know how to pronounce it properly thanks to their stupid title-screen video clips, unlike ASUS, which the vast majority of people probably mispronounce.
So we have a normal latency of <= 1ms, with some outliers up to 7 ms. Let's go with a 2 ms average latency L.
For encoding frames, let's go with the x264 estimation (for 1080p!) of 8ms. Let's say the same amount of latency for decoding (though it should be possible to get that lower). So 16 ms codec latency C.
To transmit the data, I'd estimate around 5 ms transmission time T.
With these assumptions/estimations, we get an additional input latency of 2*L + C + T, or 25 ms. That's <2 frames at 60 FPS or <1 frame at 30 FPS.
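Putting those estimates together, a quick sketch of the arithmetic above (the variable names are mine, and the values are the assumed estimates, not measurements):

```python
# Estimates from the posts above, all in milliseconds (assumptions, not measurements).
L = 2   # average WiFi latency
C = 16  # codec latency: ~8 ms encode + ~8 ms decode
T = 5   # time to transmit one compressed frame

# The WiFi latency is paid twice: control inputs to the PC, encoded frame back.
added_ms = 2 * L + C + T
print(added_ms)                      # 25 ms of added input latency
print(added_ms / (1000 / 60))        # ~1.5 frames at 60 FPS
print(added_ms / (1000 / 30))        # ~0.75 frames at 30 FPS
```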
Hardware support for low-latency encoding is what NV is claiming.
This is some good information, and may go along with what I was beginning to imagine would be the way to make the compression more effective. It would also explain the need for a dedicated video encoder on board the GPU. An uncle of mine was working in the area of video compression when OnLive was first announced, and did some guesstimations that showed there had to be something special about the compression they used. Later, I think, he was told that information was in fact being shared between the GPU and the encoder during the rendering of each frame, allowing further optimization of the signal bit-rate. I suspect something similar may be going on here.

> Digital Foundry has a nice analysis of things (that are known). Some highlights are:
> http://www.eurogamer.net/articles/df-hardware-nvidia-project-shield-spec-analysis
>
> This feature is Kepler-only, owing to the onboard h.264 video encoder incorporated into the silicon. Instead of rendering out completed frames to the display, Kepler encodes them at the driver level with no hit to CPU performance and the PC beams out the video feed to Shield.
>
> We have high hopes for Shield's streaming performance considering the overall hardware and software set-up. First up, there's the fact that the device has a 1280x720 screen. PC-side, there's no apparent on-screen rendering meaning that the software can pour all of its resources into a native 720p framebuffer, so even the entry-level GTX 650 should be able to produce decent visuals and frame-rate on virtually every game. Secondly, the resolution limit means that vast amounts of bandwidth won't be required to maintain excellent image quality - 15-20mbps would be lavish by OnLive standards and unattainable on the majority of broadband internet connections, but should be easy to work with on any modern router.
>
> In the home, we should see massive improvements to latency over the OnLive experience too, not just because of the more localised environment, but also owing to the way that the h.264 encoder will have access to the completed framebuffer without having to scan-out from the video output. Provided the panel chosen by Nvidia for the Shield is fast enough, there's every reason to believe that the whole process could be completed with latencies close to that of the average HDTV.
> All these Android consoles look pretty cool. But I'd rather get an Xperia Play 2, or a Google Nexus with physical buttons, instead. Nintendo and Sony already make awesome handhelds, and buying another handheld besides my phone, my 3DS and my Vita is a bit overkill.

Yeah, but those devices don't have the render power of a modern Kepler card at their disposal.
> Could you explain how you reached this estimate for T? This is the one I'm having trouble with. How big exactly is a compressed frame?

I'm speculating starting from x264's "zerolatency" mode, which is designed for applications like this. Back at the start of 2010, it was doing 800x600 in 10 ms.
H.264 compression, as far as I understand, isn't designed for compressing one frame at a time. Instead, it breaks the video stream into blocks that usually extend across multiple frames, both in the past and in the future; unless there is considerable change or motion between frames (quick cuts or camera moves), most of the information between frames is the same. Wikipedia claims you can get MPEG-2-quality TV video at 1.5 Mbps with H.264, but surely that doesn't mean an individual frame is 4 kB in size(!), for this reason exactly.
Even the numbers you provide for compressing H.264 in real time (8 ms for 1080p video) rely on having a pre-recorded video source, where you can use the common information between past and future frames to drastically cut back on video bitrate.
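As a side note, the average frame size implied by that Wikipedia figure is easy to work out (my own arithmetic, and the 30 FPS frame rate is an assumption the figure doesn't state):

```python
bitrate_bps = 1.5e6   # 1.5 Mbit/s, the figure quoted above
fps = 30              # assumed frame rate for that figure

avg_frame_bytes = bitrate_bps / 8 / fps
print(avg_frame_bytes)  # 6250.0 bytes, i.e. roughly 6 kB per frame on average
```

Of course that is only an average: I-frames are far larger than the P/B-frames between them, which is exactly the inter-frame dependence described above.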
If this is basically the Wii U GamePad for PC, I'm down as hell.
> help me if mistaken

You are mistaken. It streams video (compressed directly on the PC's GPU) to the device via WiFi.
> We got our first chance to go hands-on with the device this morning -- our first hands-on with any Tegra 4 device, mind you -- and came away impressed. Beyond being a speedy handheld, the 5-inch LED makes high-def PC games look even more visually stunning. Sheer pixel density alone meant that our test run of Need for Speed: Most Wanted looked even better on Project Shield than it did on the PC running it. More importantly, there was zero perceptible lag.

Not that I would ever trust Engadget to correctly assess lag, but it probably can't be too bad.
As for individual frame size, it will of course be larger than with normal h.264 compression. Still, even if we assume an increase in frame size by a factor of 10(!) due to this, that will only be ~200 kB per frame. To transmit that in 50 ms we would need 40 Mbit/s. This is how I got the number.
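Spelled out, the transmission arithmetic in that estimate looks like this (note that it spreads the frame over a 50 ms window rather than the 5 ms transmission time T assumed earlier, and the result comes out somewhat below the 40 Mbit/s figure, though in the same ballpark):

```python
frame_kb = 200     # assumed worst case: 10x a normal h.264 frame
window_ms = 50     # transmission window used in the estimate above

# Required throughput in Mbit/s (decimal units: 1 kB = 1000 bytes here).
mbps = frame_kb * 1000 * 8 / 1000 / window_ms
print(mbps)  # 32.0 Mbit/s
```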
> am I wrong in correcting this?

You are right, I am wrong, and that's embarrassing. So better hope that the encoding does not cause a 10x blowup in frame size; then we're perfectly fine.
is my math off?
Help me if I'm mistaken: I read that you can only play Steam games via Big Picture mode through the TV only, not the screen on the pad! And I think you need a PC to stream the Steam client to the Nvidia handset wirelessly via WiFi.
Seems a bit pointless; I can do that now:

Laptop + Xbox controller + Steam, connected to TV via HDMI lead.

Why would I want:

Nvidia Shield + HDMI lead to TV, connected via WiFi stream from laptop + Steam?
The fact that it's very much still a prototype is a bit reassuring regarding its, ahem, style of casing.
> Does anyone know how to download the Engadget video? It should be possible to get some latency measurement from that, since you see both the monitor and the handheld.

Ok, I downloaded it. Frame-stepping through it, I see exactly one frame of (30 FPS) latency between the monitor and the Shield in that video. So that's <= 33 ms.
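For reference, one frame of lag at the video's capture rate bounds the latency like so (variable names are mine):

```python
video_fps = 30     # frame rate of the Engadget video
lag_frames = 1     # lag observed by frame-stepping

upper_bound_ms = lag_frames * 1000 / video_fps
print(round(upper_bound_ms, 1))  # 33.3 ms upper bound on the streaming latency
```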
Maybe that should be added to the OP, since it's a real demonstration of the hardware to a third party and not just some marketing number.