Official thread of the PC vs Consoles war

Today it's monitor overclocking, 4K and GDDR6:
Week In Tech: Overclock Your Monitor With NVIDIA | Rock, Paper, Shotgun

By Jeremy Laird on March 4th, 2013 at 8:00 pm.

[Image: monitor overclocking]


A high quality LCD panel. Or high refresh rates. Take your pick. Because you can’t have both. Well, not unless you think BadgerJump Monitors (or whatever they’re called) sounds like a sensible consumer electronics brand and you’re thus willing to roll the dice on a dodgy Korean panel off eBay. But wait. One of the footnotes to NVIDIA’s recent Titan graphics card launch is a new monitor overclocking feature. Yup, monitor overclocking. But will it give you 120Hz for free? Will it fry your panel? Do you need NVIDIA’s £800 Titan? Should you actually care about high refresh? I’ve got the answers…

First up, if you’re not interested in chewing over the broader subject of high refresh and you’re keen to overclock the twangers off your monitor immediately, skip to the last third and the ‘hands on’ bit. There, I explain what you need, where to download it and my experiences so far. The simple version is that if you’ve any kind of NVIDIA graphics card, you’ll probably be able to have a go.

Anywho, monitor refresh rates. They’ve always been a bit of a brain ache. Back in ye olde days of CRTs, you had to play off resolution and refresh. The higher the resolution, the lower the refresh was the general rule.

CRTs generate an image line by line, of course. So higher refresh makes for a more stable, less flickery image. Easier on the eye. Less chance of a headache. Plenty of obvious benefits.

[Image: Remember when flat CRTs were the bomb?]

With LCD panels, none of that applies. In really simple terms, you can think of an LCD panel as being always on. The image isn’t generated line by line. Instead, every pixel is simply updated at a given frequency or refresh rate. Even if you reduced the refresh rate to 1Hz, there would be no flicker. You’d just have seriously crappy frame rates.

In truth, it doesn’t work quite like that. But that’s close enough to the reality for argument’s sake. Anyway, the point is that flicker isn’t an issue on LCDs. But frame rates are. It’s at this point that the science of human sight enters the equation and I have to admit my limitations. Or at least frustrations. It’s a subject about which I’ve always found the science a little unsatisfactory.

To get an idea of what I’m talking about, let’s have a think about the various video formats around today. Take movies, for instance. Until recently, the standard frame rate (or effectively refresh rate, though actual projection rate or shutter speeds vary) for a movie was 24 frames per second. That’s enough, I think you’d agree, for what looks like smooth, natural motion when you’re plugged into a pew at the cinema.

However, if you’ve suffered through the impenetrable tedium that is The Hobbit in High Frame Rate (HFR) format, you’ll know that doubling the frame rate to 48 frames per second makes an enormous difference to the look and feel of motion. One can argue the toss over the question of whether HFR looks better. But clearly the human eye and mind can distinguish 24fps from 48fps.

[Image: Bloody awful: The Hobbit in HFR. Not sure about the frame rate, either.]

Now, consider that the standard refresh rate for a flat panel PC monitor is 60Hz or effectively 60 frames per second. Significantly higher than the new HFR movie format, then. And you might think high enough for completely fluid motion.

That’s pretty much what I thought until fairly recently. I used to assume the only benefit to rendering games above 60fps was that it gave you more headroom for those occasional frame rate troughs. Rendering well above 60fps on average, in other words, makes it less likely you’ll drop significantly below 60fps at any given moment.

Given that all LCD monitors were limited to 60Hz, that was literally true. But I didn’t give any credence to the idea of an upside to a monitor capable of higher refresh rates.

Then NVIDIA rolled out its 3D Vision tech, requiring monitors capable of delivering 60Hz to each eye and thus 120Hz overall. And I had an immediate epiphany. Getting a precise handle on exactly where the threshold is for the human eye and brain in terms of frame rates is tricky. No doubt it varies, anyway. We’re analogue beasts, not digital droids.

But I can tell you this for sure. Even just pushing windows around on the desktop, the difference between 60Hz and 120Hz is obvious.
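As a quick back-of-the-envelope illustration of what the jump buys you in per-frame terms, here is a small Python sketch (the numbers are just reciprocals of the refresh rate; the helper name is purely illustrative):

[CODE]
# Per-frame time budget at common refresh rates: simply 1000 / Hz.
def frame_time_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

for hz in (24, 48, 60, 85, 100, 120):
    print(f"{hz:>3} Hz -> {frame_time_ms(hz):5.1f} ms per frame")

# 60 Hz delivers a new frame roughly every 16.7 ms; 120 Hz every 8.3 ms.
# Halving the gap between frames is what makes desktop motion look slicker.
[/CODE]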

[Image: 3D gaming according to NVIDIA. Uh huh.]

That said, I’m not sure I can tell the difference between 100Hz and 120Hz. A further complicating factor is motion blur. It’s this that allows a 24fps movie to seem smooth, which on the face of it doesn’t make much sense in the context of our ability to distinguish 60fps from 100fps.

Anyway, the most important point is that on the desktop, in games – basically, on your PC – 100+Hz is bloody lovely. I can’t stress that enough. High refresh rates make your PC feel much more responsive and look much slicker. To coin an Alan Dexter-ism (you know who you are!), high refresh makes all your games look awesome. The only snag is that you’ll need a video board that can feed all those frames.

Well, that and the fact that, currently, high refresh rate monitors are limited to TN-type panel technology. Yes, a boutique industry has popped up involving 27-inch IPS Korean monitors capable of 100+Hz. But in the real world, high refresh monitors are TN. And TN has the worst image quality by every metric save response times.

That it’s the quickest panel type makes it a natural fit with high refresh rates. But the latest IPS panels are pretty nippy, too. And high end HDTVs now typically offer 100Hz and beyond refresh rates but do not use TN panels.

The hands-on bit:

Overall, I doubt there’s any good technical reason why you can’t buy an IPS 120Hz screen. It’s just none of the big boys have had the balls to try it, so far. But can you make your own? Now, that’s an intriguing question.

When I first heard about NVIDIA’s monitor overclocking, it was supposedly limited to the new Titan board and was thus irrelevant. But no. It works with a broad range of NVIDIA graphics cards. I’ve tested a current GTX 680 and an ancient 7900 GTX.

[Image: Monitor overclocking arrives with Titan. Works with older NVIDIA boards, too.]

The latter dates from 2006 and works fine with the monitor overclocking tool. So, I’m going to assume anything newer will be dandy. Software-wise, NVIDIA isn’t providing the overclocking tool directly. It comes from board partners. I’ve been using the EVGA Pixel Clock tool, which works with non-EVGA cards. You can download it here.

For the record, I tested it with an AMD graphics board and no dice. It simply throws up an error decrying a missing NVIDIA API.

The real question is monitor compatibility. I’ve tested four monitors with variable results. Most disappointing are my Dell 3007WFP and 3007WFP-HC. Neither would run even 1Hz higher than 60Hz. Bummer.

Next up is my Samsung XL30. That will run up to 72Hz, but behaves oddly at that refresh. The fastest stable refresh it supports is 70Hz.
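For what it’s worth, here’s a rough Python sketch of why a 2560x1600 panel fed over dual-link DVI doesn’t have much refresh headroom. The ~10% blanking overhead is my own rough assumption rather than a real CVT timing calculation, and the panel electronics may give up before the link does, but the arithmetic lines up suspiciously well with 70-72Hz being the ceiling:

[CODE]
# Estimate the pixel clock needed for 2560x1600 at various refresh rates and
# compare it against the dual-link DVI limit of ~330 MHz (2 x 165 MHz TMDS).
# The 10% blanking overhead is a rough assumption, not real CVT-RB timings.

DUAL_LINK_DVI_MAX_MHZ = 330.0

def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=0.10):
    return width * height * (1 + blanking_overhead) * refresh_hz / 1e6

for hz in (60, 70, 72, 85, 120):
    clk = pixel_clock_mhz(2560, 1600, hz)
    verdict = "fits" if clk <= DUAL_LINK_DVI_MAX_MHZ else "exceeds dual-link DVI"
    print(f"{hz:>3} Hz -> ~{clk:3.0f} MHz pixel clock ({verdict})")
[/CODE]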

In many ways, the most interesting test subject is my Dell U2711. That’s a 27-incher with a modern IPS panel and lots of inputs. It’s exactly the sort of monitor I’d want to be overclocking for games.

[Image: EVGA’s tool is pretty much idiot proof.]

Unfortunately, I found it essentially doesn’t overclock at all. I tested up to 80Hz and it will render an image. But at any setting above 60Hz, the frame rate is jerky and stuttery. Something odd is going on with the image processing.

If that’s disappointing, what’s interesting is that I reckon I can feel the difference at 70Hz on the XL30. It’s noticeably smoother. Reading around, it looks like 85Hz or thereabouts is probably where the maximum subjective smoothness kicks in, so you don’t need to achieve 100+Hz to get a tangible benefit from monitor overclocking.

The proviso to all this involves the unknown risk to your hardware. My understanding is that it’s actually pretty safe. But the usual small print applies. Move up in very small steps. And it’s all at your own risk.

Still, something for nothing is always nice. I’ll probably be running my XL30 at 70Hz from here on in.

I’d also be very interested to hear how any of you guys get on overclocking your panels. Good luck!

The State of 4K and 4K Gaming: Early 2013 - Bright Side Of News*
3/4/2013 by: Anshel Sag

Since our last article, we have been able to peer a little deeper into the state of 4K. There’s a lot of talk about 4K now that most of the major TV vendors have shown off the displays that they have planned for 2013 and beyond. We have seen displays from Viewsonic and Sharp as well as TVs from LG and Sony. Some of these displays are slowly starting to leak out into the market, but in extremely limited quantities and with some incredibly high price tags. LG’s currently ‘available’ 4K TV is a gargantuan 84” and sports a single HDMI 1.4a connection rather than two dual-link DVI inputs.

Currently, the only display that you can really get your hands on is LG’s 84” TV, which features a limited refresh rate of 30Hz due to being run off HDMI 1.4a (which only supports 4096x2160 at 24Hz). Until HDMI 2.0 comes out and becomes a standard, you won’t really be using an HDMI-based 4K display for much of anything other than watching TV. However, there are 4K displays capable of 60Hz and beyond. We currently have an EIZO FDH3601 display, which is powered by two dual-link DVI cables that each drive one half of the display.

[Image: Our 4K testing setup on the right with the EIZO 4K monitor]

Even though this display also has DisplayPort 1.1 connectors, you need DisplayPort 1.2 (capable of 4K at 60Hz) in order to drive a 4K display with a single cable. Therein lies our biggest problem: right now, those devices simply aren’t available to the mass market. At CES, we spoke with Viewsonic and Sharp about their 4K displays that would feature a single DisplayPort 1.2 connector, and Sharp told us to expect their 4K displays in late 1Q and early 2Q 2013. Viewsonic gave us a slightly later expectation, with no concrete launch date.
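As a rough sketch of the bandwidth problem in Python (the effective data rates are the commonly quoted post-8b/10b figures for each link; blanking is ignored, so the payload numbers are slightly optimistic, and the function name is just for illustration):

[CODE]
# Uncompressed 4K payload versus what each link can actually carry.
EFFECTIVE_GBPS = {
    "HDMI 1.4a":       8.16,   # 340 MHz TMDS x 3 channels, after 8b/10b
    "DisplayPort 1.2": 17.28,  # 4 lanes x 5.4 Gbit/s, after 8b/10b
}

def payload_gbps(width, height, refresh_hz, bits_per_pixel=24):
    return width * height * refresh_hz * bits_per_pixel / 1e9

for hz in (24, 30, 60):
    need = payload_gbps(3840, 2160, hz)
    fits = [link for link, cap in EFFECTIVE_GBPS.items() if cap >= need]
    print(f"3840x2160 @ {hz:2d} Hz needs ~{need:4.1f} Gbit/s -> {fits or 'nothing fits'}")
[/CODE]

At 60Hz the payload roughly doubles past what HDMI 1.4a can carry, which is why a single DisplayPort 1.2 cable is the practical requirement.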


GPU Support of 4K

The reason we’re talking about displays is that we already had a chance to test both AMD’s and Nvidia’s 4K support in our initial article; however, running a dual-link DVI display with these graphics cards has been a bit difficult. The EIZO FDH3601 is recognized by Windows and the graphics card drivers as two displays, because each half of the panel is driven by one of the dual-link DVI cables. This can make for some fun gaming scenarios, to say the least.

Nvidia’s graphics drivers are generally designed for configurations of one or three displays; having two displays does not really work very well, as most games try to maximize themselves into one half of the display or the other. As such, we could not enable Nvidia’s Surround feature, which turns multiple monitors into one desktop. Because of this limitation, we could only run applications windowed and then maximize them manually or using software. Running applications like Adobe Photoshop CS6 and Premiere Pro was no issue; however, they still required maximizing. With Battlefield 3, we were able to get the GTX Titan to display in 4K very well, but once we enabled SLI we had issues with one half of the display running at 30Hz and the other at 60Hz. Nvidia’s Tegra 4 also claims 4K support over HDMI 1.4a, which would limit you to 30Hz, effectively making it video playback only.

AMD’s graphics drivers are unsurprisingly more flexible when it comes to multiple displays, as AMD created Eyefinity, which originally drove Nvidia to develop its Surround technology. Eyefinity enables up to six displays on a single graphics card, with up to three available on almost all non-specialized graphics cards. What’s good about AMD’s implementation is that it is first and foremost incredibly easy to use and does almost all of the work for you. It does not discriminate between one, two or three displays or how they are set up. For this, we give AMD kudos. However, their drivers suffer from the same problem that Nvidia’s do when running multiple GPUs. We’re not entirely sure whether this is because of the type of monitor or because of the way the graphics are handled in multi-GPU configurations. But both AMD and Nvidia have the same issue when running multi-GPU on the EIZO display.

To remedy this issue for gamers and people doing 3D modeling, users will need to obtain the DisplayPort 1.2-capable displays coming from Sharp and Viewsonic. We’ve also heard murmurings that some US display manufacturers may be getting Sharp displays and rebranding them as their own. Nevertheless, in order to really be able to fairly measure game and application performance across all platforms, DisplayPort 1.2 is necessary, since it delivers 4096x2160 at 60Hz. We will have to wait for these displays to arrive on the market before we can accurately measure 4K performance between Nvidia, AMD and Intel (and perhaps even Qualcomm).

Intel’s current supported 4K solution (dual Thunderbolt cables) is not really intended for anything more than desktop display and 2D applications. Their GPUs simply aren’t powerful enough. With Haswell, Intel should get a significant performance boost, but don’t expect more than video playback and 2D performance for quite some time. Intel will likely support a single cable solution with DisplayPort 1.2 enabled displays which should make 4K more accessible to those that don’t want to spend money on a dedicated GPU for video playback and 4K in 2D.

Qualcomm’s current 4K support is primarily being touted in its Snapdragon 800 processor, which was shown at this year’s CES playing back a 4K video. Considering that the Snapdragon 800 likely doesn’t have DisplayPort 1.2 or HDMI 2.0 support, it will, like the Tegra 4, be limited to 4K video playback at 24Hz. Unless it runs at 3840x2160, in which case HDMI can be driven up to 30Hz.

While little has been said about Imagination Technologies’ mobile GPUs, they have found their way into some of the 4K displays themselves. Imagination Technologies’ PowerVR graphics IP is inside LG’s latest 4K displays, as LG showed us at CES 2013. We’re still waiting to hear more from them about mobile and 4K support, but considering that their latest PowerVR Series6 graphics already supports 4K, we expect an official mobile announcement not to be far behind.

4K Gaming (Performance and support)

When it comes to 4K gaming, we found ourselves a bit limited by the display that we were using as it isn’t necessarily intended for gaming. The EIZO FDH3601 is really designed for more 2D applications with some 3D applications being possible with the right professional graphics solutions from Nvidia and AMD.

Nevertheless, we were able to get trusty ole BF3 to run on both graphics solutions and we’ve got benchmarks. We compared Nvidia’s latest and greatest single GPU against AMD’s latest and greatest single GPU. Since we don’t have an HD 7990 or an ARES card, we’ll have to make do with two HD 7970 GHz PCS+ Edition cards from PowerColor.

[Chart: Battlefield 3 4K benchmark results]


Here we have our Battlefield 3 results. As you can see, one GTX Titan is basically as fast as two HD 7970 GHz Edition cards. While we were unable to run two Titans successfully due to display scaling issues in SLI (likely down to the type of display being used), there is no doubt that the Titan is the best single GPU in the world for 4K gaming. Two HD 7970 GHz Edition GPUs do deliver the exact same gaming experience at a lower cost, but their power consumption and heat output are significantly greater. If you want the most elegant solution, the GTX Titan is the best GPU; if you want the best bang for your buck, AMD’s HD 7970 GHz Edition has you covered.

After playing some Battlefield 3, we decided to see how other games panned out. Unfortunately for us, most games would not cooperate with the GTX Titans because of the way the consumer-card drivers handle this display; as mentioned earlier, most games would only run on one half of the screen. So, for the rest of our gaming benchmarks, we ran two AMD Radeon HD 7970 GHz Edition graphics cards. First, we tried our hand at Crysis 3 to see if two of AMD’s cards could, in fact, play Crysis 3 in 4K.

[Chart: Crysis 3 4K benchmark results]


Looking at our results, we were unable to run two HD 7970 GHz Edition cards in CrossFire smoothly at maximum settings; that unfortunately requires three HD 7970s or two HD 7990s. In order to play Crysis 3, we had to dial the graphics preset down a notch, from Very High to High.

[Chart: 4K game performance at highest settings]


As you can see above, we could get most of the games playing at their highest settings; however, Batman: Arkham City in 4K simply was not playable at Ultra settings, so I had to dial it down to Very High. The game still looked absolutely stunning, but not as good as it could look. Borderlands 2 and Skyrim both ran very smoothly and quickly, as you can see above. Skyrim definitely ran the fastest, but that is without the high resolution texture pack and, I have to be honest, Skyrim definitely looked the worst of all the games. If you’re going to play Skyrim in 4K, you’re going to want to get the high resolution texture pack.

Even with AMD’s Eyefinity enabled, we still ran into scaling issues, with games like Far Cry 3, Max Payne 3 and Counter-Strike: Global Offensive either being unplayable or only rendering on half of the screen. While we’re not entirely sure whether the graphics drivers or the games themselves are to blame, we really hope that game developers are preparing to support 4K resolutions, like it or not.

We are hoping to get our hands on some of the newest DisplayPort 1.2 displays very soon and will do a proper 4K gaming comparison when they arrive.

4K Media

In addition to 4K displays and 4K gaming, there is also a huge thirst for 4K video content to drive these new displays. Currently, there are very few 4K cameras out there, with most of them being 4K cinema cameras. There are, however, bright spots in the quest for 4K content with companies like GoPro supporting 4K video at 15 frames per second with their GoPro Hero 3 Black Edition. While 4K at 15 FPS is not really a good video frame rate, it is now only a matter of time until the next GoPro Hero supports 4K at 24 and 30 FPS.

JVC also has a $4,999 handheld camera, the GY-HMQ10U, which records 4K video at 3840x2160 (Quad HD) at 24, 50 and 60 FPS. This is still technically a professional camera; however, at $5,000 it is significantly cheaper than almost all of its competitors. The next most affordable 4K camera comes from Sony for $8,000, almost double the price.

There is no doubt that 4K content is slowly catching up from where it used to be, when only Sony and Panavision made incredibly expensive 4K cameras. For part of that, I believe, we have RED Digital Cinema to thank for having helped drive down the cost of 4K.
[YOUTUBE]lXY8Szhz42M[/YOUTUBE]

Now that cameras are beginning to come down in price and improve in specifications, we’re also starting to see codecs come along as well. The latest codec, developed and ratified earlier this year, is HEVC (High Efficiency Video Coding), also known as H.265. HEVC will replace AVC, or H.264, over time and is expected to cut bandwidth by about 50% while maintaining the same or better quality than H.264.

This new codec will find its way into hardware and software once the movie industry decides how they plan to use the new codec in their content sometime later this year. We recently wrote an article about HEVC where NTT DoCoMo demonstrated 4K running at about 10Mbps, which is significantly lower than anything anyone would’ve expected. Furthermore, 10Mbps allows for the streaming of 4K over the internet and can enable instant delivery of content to new 4K TV owners.
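To put that 10Mbps figure into perspective, here is the quick arithmetic in Python (a sketch that simply assumes the bitrate holds steady for a full-length film):

[CODE]
# How much data a 10 Mbit/s 4K HEVC stream actually moves.
bitrate_mbps = 10
for hours in (1, 2):
    gigabytes = bitrate_mbps * 1e6 * hours * 3600 / 8 / 1e9
    print(f"{hours} h at {bitrate_mbps} Mbit/s ~= {gigabytes:.1f} GB")
# A two-hour film comes to roughly 9 GB, which is comfortably streamable
# on a decent home broadband connection.
[/CODE]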

So, it appears that 4K is well on its way to becoming the next HD standard and that 2013 is definitely the year where that fact becomes concrete. In the past, we were sold technologies like 3D while 4K was already possible and the market pushed back. 4K promises to deliver four times as many pixels as 1080P while remaining on the same screen size and giving us comparable visual quality that we already expect to see on our phones. While I can’t necessarily say that 2013 will be the year of 4K with so many questions still in the process of being answered, it is safe to say that 2013 will be a defining year for 4K and HD video as a whole.

And GDDR6 for next year:
AMD Kaveri Unveiled: PC Architecture Gets GDDR5 - Bright Side Of News*
3/5/2013 by: Marcus Pollice

We were able to take a peek at AMD NDA information (aimed at engineers) that details the technical features of the Kaveri APU. According to this information, Kaveri features a GDDR5 memory interface consisting of four 32-bit memory channels. This perfectly matches the width of a GDDR5 chip, which is also 32-bit. However, the memory controller has to be set up so that two 32-bit channels work in tandem; half-channel use is not supported. The total width is 128-bit, so the main advantage comes from the higher clock speeds of GDDR5 memory. This is in addition to the 128-bit DDR3 interface that we already know from previous APUs. Usage of DDR3 and GDDR5 is mutually exclusive.

While the information we glanced at is clearly preliminary and could change before the release of the chip, the focus appears to be on less expensive GDDR5 chips with moderate clock speeds – not the 6GHz hotness you find in high-end graphics cards like the GeForce GTX Titan. Specifically, the document lists 800MHz QDR and 850MHz QDR (3200MHz and 3400MHz effective) clocks, which would result in 51.2 GB/s and 54.4 GB/s of system memory bandwidth. Compared to the current 25.6 GB/s with DDR3-1600, this is quite the performance bump. The surprises don’t end there – Kaveri will support DDR3 up to 1250MHz DDR (2500MHz effective); it specifically adds 2400MHz and 2500MHz modes over Trinity, which officially supported up to 2133MHz. Nevertheless, GDDR5 would provide a tangible bandwidth improvement and might be the smarter choice given that DDR3 above 1866MHz starts to get prohibitively expensive.
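The quoted bandwidth figures fall straight out of bus width times effective transfer rate; a quick check in Python (the configurations are the ones listed above, and the helper name is just for illustration):

[CODE]
# Peak bandwidth = effective transfer rate (MT/s) x bus width in bytes.
def peak_bandwidth_gbs(mega_transfers, bus_width_bits):
    return mega_transfers * 1e6 * (bus_width_bits // 8) / 1e9

configs = [
    ("DDR3-1600, 128-bit",           1600, 128),
    ("GDDR5 800 MHz QDR, 128-bit",   3200, 128),
    ("GDDR5 850 MHz QDR, 128-bit",   3400, 128),
]
for name, mts, width in configs:
    print(f"{name}: {peak_bandwidth_gbs(mts, width):.1f} GB/s")
# -> 25.6, 51.2 and 54.4 GB/s respectively, matching the figures above.
[/CODE]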

We can anticipate one potential downside with GDDR5, however. Currently, the most commonly deployed chips only feature 2 Gbit capacity, which translates to 256MB per chip. To fully populate the GDDR5 interface you need four chips, totaling a mere 1GB, which can be considered a bit low for system memory. With two chips per channel we get 2GB, which is still low but starts to get workable. We don’t have any hard info on how many chips per channel are possible with GDDR5; on GPUs, currently no more than two are used. Either AMD plans to have a crazy amount of GDDR5 BGA packages soldered to the boards, or they are betting on 4 Gbit chips, which would bring 4GB configurations within reach, or the whole GDDR5 support is meant for embedded systems with lower memory requirements. Remember that even Nvidia had to give up on the idea of selling the Tesla K20 card with 12GB of memory and had to settle for 6GB. Last year, we talked with representatives from Samsung Memory, who dismissed the idea of offering higher-capacity GDDR5 chips, but as time goes by, opinions might change.
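The capacity arithmetic is equally simple; a short sketch (chip counts per channel beyond two, and the 4 Gbit parts, are speculative, as noted above):

[CODE]
# Total system memory from GDDR5 chip density and chip count.
def total_memory_gb(chip_gbit, chips):
    return chip_gbit / 8 * chips  # Gbit per chip -> GB, times number of chips

print(total_memory_gb(2, 4))   # 2 Gbit chips, one per 32-bit channel   -> 1.0 GB
print(total_memory_gb(2, 8))   # 2 Gbit chips, two per channel          -> 2.0 GB
print(total_memory_gb(4, 8))   # hypothetical 4 Gbit chips, two/channel -> 4.0 GB
[/CODE]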

The GDDR5 interface inside Kaveri represents a very interesting option. After Sony made its PS4 announcement (8GB of GDDR5), this step actually appears to be a logical evolution of the APU products. Current APUs feature quite capable GPUs that are hamstrung by low memory bandwidth compared to entry-level discrete GPUs. Even though GDDR5 memory reduces system flexibility (the BGA package has to be soldered to the mainboard), the performance gain may more than make up for that.

Other changes in Kaveri include a PCIe 3.0 interface, which brings the APU in line with Intel’s Ivy Bridge and Haswell CPUs. In addition to the 16-lane PCIe 3.0 interface, Kaveri features eight PCIe 2.0 lanes for general-purpose use. The PCIe 3.0 interface can be configured as two x8 interfaces, enabling CrossFire or a discrete GPU plus a storage card (solid state cards are gaining in popularity). Four of the PCIe 2.0 lanes are used to connect the southbridge, or Fusion Controller Hub (FCH) as AMD calls it; the other four lanes can be used as four x1, two x2 or one x4 connection.

We also tried to clarify whether Kabini, which is scheduled to debut around the Computex timeframe, contains a GDDR5 interface as well. Given that the PS4 uses a significantly modified version of Kabini and utilizes GDDR5 memory, such a possibility would not be unreasonable. However, the documents we were able to look at didn’t contain any information on such a feature. While this is not a 100% confirmation, Kabini with GDDR5 for the PC market probably won’t happen.

Once DDR4 memory comes along, the advantages of GDDR5 start to diminish again. DDR4 will enable higher densities at comparable speeds, as well as upgradeable modules, which might be more desirable for customers. More importantly, we view Kaveri’s GDDR5 as a trial run, a first-generation part. As we all know, GDDR5 will be replaced with GDDR6 in the 2014 timeframe – a foreshadowing of what might await us down the road: APUs in BGA packages soldered to the board, with memory soldered alongside and next to no upgradeability. While this is not desirable from an enthusiast point of view, it is what the mass market will inevitably move to in the next few years.

GDDR6 Memory Coming in 2014 by VR-Zone.com
Reported by Theo Valich on Thursday, June 21 2012 3:17 pm

Without any doubt, GDDR5 memory is the prevalent high-speed memory of today. The standard attracted a lot of companies and powers systems from graphics cards to networking switches, from cars to rockets and even lunar landers. Thus, the big question remains: when is the successor going to arrive?

You might not know this, but AMD, i.e. Advanced Micro Devices, is actually the company behind the creation of the GDDR memory standard. The company did a lot of great work with GDDR3 and now with GDDR5, while GDDR4 was simply on the market for too short a time to drive demand for the solution. In the words of our sources, "GDDR3 was too good for GDDR4 to compete with, while everyone knew what was coming with GDDR5."

GDDR5 as a memory standard was designed as Dr. Jekyll and Mr. Hyde. Single-Ended GDDR5, i.e. Dr. Jekyll, was created to power contemporary processors, while Differential GDDR5, i.e. Mr. Hyde, was designed to "murder Rambus and XDR". Ultimately, the conventional S.E. GDDR5 took off better than expected and clocked higher than anyone hoped for. While the top of the standard was only expected to reach 1.5GHz QDR, i.e. 6 "effective GHz", with overclocking, both AMD and NVIDIA ended up shipping retail parts at 1.5GHz, with overclocks as high as 1.85GHz (7.4 "GHz"). This brought us to more than 250GB/s of achievable bandwidth, meaning that the purpose of Differential GDDR5 was lost.
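For reference, that ">250GB/s" figure works out if you assume a 384-bit card; the bus widths in this little Python sketch are my assumption, since the article doesn't name a specific board:

[CODE]
# GPU memory bandwidth = per-pin data rate (GT/s) x bus width in bytes.
def gpu_bandwidth_gbs(data_rate_gtps, bus_width_bits):
    return data_rate_gtps * bus_width_bits / 8

print(gpu_bandwidth_gbs(6.0, 256))  # 192.0 GB/s - 256-bit card at the 6 GT/s spec ceiling
print(gpu_bandwidth_gbs(5.5, 384))  # 264.0 GB/s - stock 384-bit card
print(gpu_bandwidth_gbs(7.4, 384))  # 355.2 GB/s - 384-bit card at the 1.85 GHz QDR overclock
[/CODE]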

Enter GDDR6. This is the memory standard that will take us to 2020 and beyond, i.e. into the third decade of the 21st century. GDDR6 is being built with so many changes, and such an accent on the driving silicon, that we expect this standard to last as long as, if not outlast, GDDR3 memory, which launched in 2004 and still makes up the vast majority of GDDR memory shipments.

The hard work at AMD is still going strong, with the effort now shifting to certifying the standard through the AMD-chaired organizations at JEDEC. There is a large number of interested parties, such as NVIDIA, Intel, Qualcomm, Texas Instruments, Cisco and others. However, this is a field where AMD is the company in charge, and regardless of whom you are rooting for, without AMD's memory team our present (and future) would look significantly different.
 
Twelve days to go until the GPU Technology Conference starts, and Nvidia keeps banging on about GeForce GRID and cloud gaming (which I still think won't work well because of latency; as usual, I hope I'm wrong):
Nvidia: GeForce GRID is "Netflix for Gamers" - Bright Side Of News*
3/6/2013 by: Theo Valich

March will bring us two content-intensive conferences, and we heard from Nvidia that there will be plenty of clarifications. At the upcoming Game Developers Conference (March 25-29), Nvidia will show GeForce GRID to the army of game developers, banging the cloud gaming drum with no compromise in quality.

Last year's GPU Technology Conference had several cloud gaming companies stirring the pot, while Nvidia and the associated investors tried to take the pulse of which companies could deliver and which couldn't. The first company to cash out on the whole deal was Gaikai, whose charismatic leader Dave Perry spent seven months on a hype-raising tour, ultimately selling the company to Sony for $380 million (it was rumored that Samsung came up $50 million short).

However, Gaikai will not be utilizing the GPU technology in its entirety, as Sony wants to use those several million unsold PlayStation 3 consoles to offer a cloud gaming service for the PlayStation 4, removing the need for backward compatibility. For Nvidia, that meant only one thing - go it alone with their own service, and have products that align nicely with it.

According to an Nvidia representative, GRID is "the upcoming on-demand gaming service from NVIDIA. It's like Netflix's video-streaming service for games, so you can enjoy them wherever you are on your connected device." The company has six partners right now: Agawi, Cloudunion, CyberCloud Networks, G-Cluster Global, Playcast and Ubitus.

However, time will tell whether Nvidia's "controlled ecosystem" approach is the right one to go with - the company had no design wins at Mobile World Congress with the Tegra 4, the Tegra 4i (Project Grey, Tegra 3 with an LTE baseband) had a single design win (ZTE), and the strategy around Project SHIELD is unknown. The upcoming GPU Technology Conference (San Jose, CA) on March 18-21 and the subsequent Game Developers Conference (San Francisco) on March 25-29 will shed a lot of light on what's going on with Nvidia.

Meanwhile, PC sales keep falling, and Samsung blames Windows 8... I still maintain my theory that we PC and console gamers are becoming a smaller minority every day:
Samsung Exec Blames Windows 8 for Declining PC Market
01:02 - Saturday 9 March 2013 by Kevin Parrish - source: Korea Times

Samsung says the PC industry will be phased out with the help of disappointing Windows 8 sales.

On Friday, Jun Dong-soo, president of Samsung's memory chip division, told reporters at the COEX InterContinental Hotel in Seoul that Windows 8 has failed to bolster demand for PCs, and that the industry will likely not rebound any time soon. What's more, he said that Microsoft's new Windows overhaul is really no better than Windows Vista, based on current market performance.

"The global PC industry is steadily shrinking despite the launch of Windows 8," he told reporters. "I think the Windows 8 system is no better than the previous Windows Vista platform."

He goes even further to say that there will be no expected boost to PC sales thanks to Windows 8's failure, and that the PC industry itself will gradually be phased out. Naturally, this comment stems from a company that seemingly makes the bulk of its revenue from Android-laced smartphones and tablets. Still, the comments hurt.

''Microsoft’s rollout of its Windows Surface tablet is seeing lackluster demand," he said. "Meanwhile, previous vigorous pitches by Intel and Microsoft for thinner Ultrabooks simply failed and I believe that’s mostly because of the less-competitive Windows platform."

Double ouch. He then goes on to question why the prices of conventional memory chips are rising even though the PC market itself is declining. Currently, U.S.-based Micron is the #1 supplier with a 51-percent share of the global market, followed by SK Hynix (31 percent) and Samsung (15 percent). He claimed that Samsung does not manipulate chip prices, but that the current situation is "surely unhealthy."

Jun's comments arrive after the International Data Corporation (IDC) said that PC shipments are expected to decline 1.3 percent in 2013. The forecast is based on poor holiday sales, an "underwhelming" reception to Microsoft's new Windows 8 platform, and a continuing economic "malaise" that further crimped IT budgets in the second half of 2012.

"Although the PC industry had banked on Windows 8 and a more varied and less expensive offering of ultrathin notebooks to revive demand, efforts thus far have been disappointing," the firm said.

A lack of touchscreen components has contributed to a limited supply of touch-enabled Windows 8 models, which in turn has hindered sales of the touch-based platform. Those that are on the market appear relatively expensive compared to other options.

And what a mess has blown up around the Xi3 Piston, the "Steam console with PC hardware": much more expensive than the PS4 ($1000 versus $300), with equal or even worse specs (again, keep an eye on that memory):
The Xi3 Piston's specifications we know about reveal it to have 8GB of DDR3 RAM, a Radeon 7000-series GPU and a 3.2GHz AMD Trinity quad-core (R464) processor. Three SSDs are available: internal 128GB, internal 256GB and internal 512GB. More details will be shared later, but you can pre-order now from the official site.
Valve-backed Xi3 Piston console starts at $1000 • News • PC • Eurogamer.net
Valve came straight out and said it isn't the "Steam console" they're still working on:
Piston Gets P-ssed On By Valve | Rock, Paper, Shotgun
Valve Distancing Itself From Piston, Xi3 Corp.
And Xi3 has come out saying it is:
Er, OK: Xi3 Claims Valve Asked Them To Make Piston | Rock, Paper, Shotgun
Xi3 Piston maker counters Valve claim over unofficial Steam Box, issues stark message to Gabe Newell • News • PC • Eurogamer.net

Anyway, we'll see how this plays out, but from what I read, the PCs that Pensamientos Ibéricos sells (correct me if I'm wrong) are better and quite a bit cheaper than a Piston. Paying three times as much for a small, cute box strikes me as silly, frankly. Although they do say it draws very little power...
 