
Dell Inspiron 13 5000 2-in-1 (8th Gen Core) Review

Pros /

Solid performance; Accurate audio; Decent viewing angles

Cons /

Stiff keyboard; slow SSD; Dim screen

Verdict /

The Dell Inspiron 13 5000 2-in-1 offers solid performance and build quality at an affordable price, but a dim screen and stiff keyboard make this machine a mediocre choice.


With a starting price of $499 ($729 as configured), the Dell Inspiron 13 5000 2-in-1 is a reasonably affordable convertible with solid build quality and a 1080p screen standard. However, Dell cuts a lot of corners on this laptop, from using an uncomfortably stiff keyboard to employing a very slow solid-state drive. If you’re willing to spend $70 to $100 more, you can get a much higher-quality consumer 2-in-1, but if price is paramount, the Inspiron 13 5000 2-in-1 is worth considering.

Specs

CPU Intel Core i5-8250U
Operating System Windows 10 Home
RAM 8GB
RAM Upgradable to 16GB
Hard Drive Size 256GB
Hard Drive Type M.2 SATA SSD
Display Size 13.3 inches
Highest Available Resolution 1920 x 1080
Native Resolution 1920 x 1080
Graphics Card Intel UHD Graphics 620
Video Memory Shared
Wi-Fi 802.11 a/b/g/n/ac
Wi-Fi Model Qualcomm QCA61x4A 802.11ac Wi-Fi adapter
Bluetooth Bluetooth 4.1
Touchpad Size 4.1 x 2.6 inches
Ports (excluding USB) HDMI
Ports (excluding USB) Combo Headphone/Mic Jack
USB Ports 3
Card Slots SD memory reader
Warranty/Support one year
Size 12.76 x 8.85 x 0.8 inches
Weight 3.45 pounds
Company Website www.dell.com

Design

The Inspiron 13 5000 2-in-1 has a functional, but unimpressive design. Made of gunmetal-gray, matte plastic, the Inspiron at least is a slightly different color than Dell’s many silver Inspirons. The screen area has a thick black bezel with oddly rounded corners that don’t match up with the square lid. This results in an area at the upper left and upper right of the screen where you see the gray plastic layer behind the bezel.


Though the Inspiron looks boring, it feels pretty solid and well-made. The body never buckled or creaked when I held it, and the hinges, which allow you to bend the screen back into tablet and tent modes, are nice and tight. However, I did notice a creaking sound on the right side of the chassis when I pressed down on the underside of the unit near the SD card slot.

At 3.45 pounds and 12.76 x 8.85 x 0.8 inches, the Inspiron isn’t particularly thin or light for its size class. Lenovo’s Yoga 720 13-inch is a mere 2.8 pounds and 0.6 inches thick, while Dell’s own Inspiron 13 7000 2-in-1 is 3.4 pounds, but 0.61 inches thick.

Display and Audio

The 13.3-inch, 1920 x 1080 screen isn’t very bright or vibrant, but it provides sharp images and fairly accurate colors. When I watched a trailer for Avengers: Infinity War, fine details like the lines in the Vision’s brow or the squares on Spider-Man’s suit were prominent. Colors, such as the red in Iron Man’s suit, which appeared somewhat brownish, or the bland blue in Doctor Strange’s costume, seemed believable but dull.


According to our colorimeter, the Inspiron 13 5000 2-in-1’s panel can reproduce a mere 71 percent of the sRGB color gamut. That’s 32 percent less than the ultraportable category average, over 50 percent less than the score from the $799 Lenovo Yoga 720 (15-inch) and nearly 40 percent behind the $849 Inspiron 13 7000 2-in-1’s showing.

At just 188 nits on our light meter, the Inspiron is around 100 points behind the category average and the scores from the Dell Inspiron 13 7000 2-in-1 and Lenovo Yoga 720 (13-inch). However, viewing angles were pretty decent, as colors stayed true at up to 45 degrees to the left and right and faded only slightly at wider points.

The Inspiron 13 outputs audio that’s reasonably accurate and loud enough to fill a midsize room. When I listened to AC/DC’s “For Those About to Rock,” the guitars and drums were only a little tinny, not horribly distorted like they are on many other laptops. If you want to tweak the sound, the Waves MaxxAudio app gives you different sound profiles and the ability to manually adjust the equalizer.

Keyboard and Touchpad

The Inspiron 13 5000 2-in-1’s keyboard is one of the stiffest and least comfortable I’ve tested this year. Laptops with chassis this thick normally have plenty of key travel (1.5 to 2 millimeters), but Dell’s keys provide only a shallow 1.1mm. Though we sometimes find low-travel laptops that provide good tactile feedback, the Inspiron isn’t one of them.


As I pounded the keys, I kept bottoming out or hitting the base with a lot of force. By the time I’d finished the 10FastFingers.com typing test, I had sore fingers and a 13 percent error rate with a speed of only 87 words per minute. My typical scores are between 95 and 105 wpm with a 2 to 4 percent error rate.

The 4.1 x 2.6-inch buttonless touchpad provided accurate navigation around the desktop in our tests. It also responded accurately to multitouch gestures such as pinch to zoom and three-finger swipe. However, the pad is just as stiff as the keyboard, making it unpleasant to click.

Ports

The Inspiron 13 5000 2-in-1 has a decent selection of ports.


The right side contains an SD card reader, a USB 2.0 port and a Noble lock slot. The left side houses two USB 3.0 ports, a 3.5mm audio jack and HDMI-out.

Performance

With its Intel 8th Gen Core i5-8250U CPU, 8GB of RAM and 256GB SSD, our review configuration of the Inspiron 13 5000 2-in-1 was more than powerful enough to handle everything we threw at it.


The laptop scored a strong mark of 12,041 on Geekbench 4, a synthetic benchmark that measures overall performance. That score is more than 50 percent better than the category average and the showing from the Core i5-7200U-powered Lenovo Yoga 720. The Inspiron 13 7000 2-in-1, which has the same Core i5-8250U CPU, scored about 8 percent higher.


The Inspiron 13 5000 2-in-1 took just 3 minutes and 45 seconds to complete our spreadsheet test, a result that’s nearly 2 minutes faster than the category average. That showing is nearly identical to the Inspiron 13 7000 2-in-1’s time and a full 17 seconds quicker than the Yoga 720’s.

The machine’s 256GB SSD is very slow for a solid-state drive, taking 42 seconds to copy 4.97GB of mixed files, for a rate of 121 MBps. That’s 46 percent behind the category average and 62 percent behind the Yoga 720.
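For reference, the quoted transfer rate follows directly from the file size and copy time; assuming the test reports binary gigabytes (1 GB = 1,024 MB), the arithmetic works out like this:

```python
# Rough check of the quoted SSD rate: 4.97GB of mixed files copied in 42 seconds.
# Assumes binary units (1 GB = 1024 MB), which reproduces the 121 MBps figure.
size_mb = 4.97 * 1024        # 4.97GB expressed in MB
seconds = 42
rate = size_mb / seconds     # MB per second
print(round(rate))           # → 121
```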

Graphics

With its integrated Intel UHD 620 graphics, the Inspiron 13 5000 2-in-1 is quick enough to play videos and run some casual games, but forget about demanding titles. Dell’s 2-in-1 scored a modest 58,043 on 3DMark Ice Storm Unlimited, a result that’s 3 percent less than the category average. However, the Yoga 720 and Inspiron 13 7000 2-in-1 were 18 and 39 percent quicker.


When we fired up racing game Dirt 3, the Inspiron 13 5000 2-in-1 managed a very-playable 47 frames per second, which is about 15 percent above the category average and within 5 frames of scores from the Inspiron 13 7000 2-in-1 and Yoga 720.

Battery Life

The Inspiron 13 5000 2-in-1 lasted a mediocre 7 hours and 1 minute on the Laptop Mag Battery test, which involves continuous surfing over Wi-Fi. That’s over an hour behind the ultraportable category average, but it’s within minutes of the Lenovo Yoga 720’s time and an hour and a half longer than the Inspiron 13 7000 2-in-1’s result.

Heat

The top surface of the Inspiron 13 5000 2-in-1 stayed comfortably cool throughout our tests, but the bottom got a little warm. After the machine streamed video for 15 minutes, the touchpad measured 77 degrees Fahrenheit and the keyboard hit 92 degrees Fahrenheit, both under our 95-degree comfort threshold. However, the bottom inched up to 98 degrees.

Webcam

The Inspiron 13 5000 2-in-1’s 720p webcam is pretty accurate compared to most built-in cameras.


A selfie I took under the fluorescent lights of our office had reasonably accurate colors — my beige shirt looked a little gray — and there was a small but palpable amount of visual noise in the background.

Software and Warranty

The Inspiron 13 5000 2-in-1 comes with a few useful Dell utilities and more than its fair share of bloatware. Dell Help & Support allows you to register, check your warranty or connect to support. Dell Power Manager Lite allows you to check the battery health and set it to charge your laptop in ways that increase the number of years your battery will last.


Dell also packs on Dropbox, which comes with a free 20GB of storage for new users, and Netflix, which most people probably have anyway. There’s also a free trial of McAfee Security, which you need to uninstall if you want to stick with Windows Defender or install the antivirus app of your choosing. As with all Windows 10 laptops, there’s plenty of Microsoft-chosen bloat, including Candy Crush Soda Saga, Bubble Witch Saga, Keeper Password Manager and a link to download the Drawboard PDF editor.

Dell backs the Inspiron 13 5000 2-in-1 with a standard one-year warranty on parts and labor.

Configurations

The Dell Inspiron 13 5000 2-in-1 starts at $499. For that price, you get a 1080p screen, but are stuck with a sluggish Intel Pentium 4415U processor, just 4GB of RAM and a 1TB hard drive. Our $729 review configuration features a Core i5-8250U CPU, 8GB of RAM and a 256GB SSD.


You can pay a full $999 to get a model with a Core i7-8550U CPU, 16GB of RAM and a 512GB SSD, but that’s a big price to pay for a laptop with this kind of budget chassis, screen and keyboard. For the best balance between performance and price, we recommend our review model.

Bottom Line

The best description for the Dell Inspiron 13 5000 2-in-1 is “meh.” It has decent performance, a dim but usable display and a plastic chassis that’s functional if not attractive. Picky typists, however, will probably want to steer clear because of the stiff, uncomfortable keyboard. If you can pay just $100 more, you’ll get a lot more style and a better keyboard and screen from Lenovo’s Yoga 720 or Dell’s own Inspiron 13 7000 2-in-1, but if this is all you can spend, you should consider the Inspiron 13 5000 2-in-1.

(laptopmag.com, https://goo.gl/p4btoD)

 


How NVIDIA killed the GTX 1080

If you’ve seen our review of the GTX 1070 Ti, then you heard us say that it offers similar performance to the GTX 1080. Now, let us show you what we mean by that. Here’s how NVIDIA killed the GTX 1080.


For this, we’re gonna compare the performance of the GTX 1080 to that of the 1070 Ti. The test setup we used for both cards is the same one mentioned in our review of the 1070 Ti. For reference, here it is.

CPU Intel Core i7-6700K
Motherboard Gigabyte Z170M-D3H
RAM 2x8GB Corsair Vengeance LPX DDR4 @ 2666MHz
Storage 250GB Samsung 850 Evo + 1TB WD Blue
PSU Corsair RM650i
Case Phanteks Eclipse P400 Tempered Glass Edition

We’re also throwing in the GTX 1070 for some benchmarks, since we have access to its results. We are using the numbers from our review of the Lenovo Y720 Cube Gaming PC, which used a Founders Edition GTX 1070 and a CPU and RAM configuration close enough to our test setup that net performance wouldn’t be significantly throttled. It packs an Intel Core i7-7700HQ and 16GB of DDR4 RAM.

Now that our setups have been disclosed, let’s begin with the synthetic benchmarks.

Note: Synthetic benchmarks are run at the highest settings at 1080p. The graphics cards were not overclocked.


It’s that close. The GTX 1070 Ti’s scores sit almost exactly in between those of the GTX 1070 and 1080, and the gap between those two cards was already reasonable in terms of both placement within the stack and pricing. Judging from these numbers, getting the GTX 1070 Ti to match the 1080 is only a matter of basic overclocking. Let’s move on to everyone’s favorite: gaming benchmarks.


At 1080p, the performance jump is fairly significant. If you’re aiming for 144Hz or higher refresh rates, the 15-20fps jump is worth it. That isn’t to say that the 1070 Ti can’t achieve those numbers by bumping up the clock speeds. Also, consider that we ran these games at the highest possible graphics settings, so settling for High instead of Ultra will still give you a nice-looking game while significantly increasing your frame rates.

At 1440p and 4K, however, you start to see diminishing returns. The performance increase is generally under 15fps at 1440p and under 10fps at 4K. This will vary from game to game, and you can actually see that in the results. Still, there is definitely a trend here.


This is actually great for us, the consumers. We are now in a better position than ever when it comes to high-tier graphics cards. A Founders Edition GTX 1070 Ti is priced at $449, which is $100 cheaper than a Founders Edition GTX 1080. The price difference for custom cards should be about the same. Is the slight performance bump worth that extra $100? I think not. The 1070 Ti arguably has the better value.

From a macro perspective, we can’t help but ask: why was there a need to compete at this price point? Was it worth cannibalizing the GTX 1080?

The biggest reason on people’s minds is that this price point was previously occupied only by AMD’s Vega 56. It’s also the most logical business move for NVIDIA, as AMD is its biggest competitor in the GPU space. You could, however, argue that NVIDIA isn’t at a loss at all, since the GTX 1080 has already been out for a year and a half and has already sold well.

The “Ti” in NVIDIA’s graphics cards is supposedly short for “Titanium,” meant to signify improved performance and power efficiency. We don’t feel that as much with the GTX 1070 Ti. It’s less a GTX 1070 Ti and more a GTX 1080 Lite. Is the “Ti” losing its meaning? Has it become just a designation for filler cards?


ASUS ROG Strix GeForce GTX 1070 Ti

We sincerely hope not. With all that said, there is nothing more for us to do but choose: board partners like ASUS, MSI, Gigabyte, Zotac, and others already have their custom cards out. If you’re in the market for a new graphics card, now you know that the 1070 Ti arguably offers better value than the 1080.

(yugatech.com, https://goo.gl/4j3iNp)

Dell Inspiron 13 7370 Hands-on Review : First Impressions

Earlier today, Dell Philippines officially launched their revamped Inspiron 13 and 15 7000 series of laptops. The company bolstered the ranks of the Inspiron 13 line by releasing two new devices, one of which is what we’re taking a look at here — the Dell Inspiron 13 7370. Packed with the latest 8th-gen Intel processor, Dell’s new laptop aims to offer performance and portability.


The Inspiron 13 7370 has a simple overall design. Covered in a matte silver finish, the laptop has few intricate design elements apart from the Dell logo, chrome accents, and the fingerprint scanner that doubles as a power button. Nevertheless, even with no standout design features, the device is very well-built: there is only minimal screen flex, and the hinges are rock-solid. The whole body feels sturdy and compact, and it’s also relatively slim and light.


Another interesting thing to note is the dual exhaust at the back, which is quite a surprise, as other laptops with this form factor usually have only one. The laptop isn’t really packing hardware powerful enough to warrant such a setup, but we aren’t complaining, as this should keep the laptop cool in just about any situation.


The notebook is equipped with, as the name implies, a 13-inch Full HD IPS display. The panel has thin side bezels, although the bottom bezel is evident. Even so, the laptop still offers plenty of screen real estate thanks to those minimal side bezels. Sadly, we weren’t able to judge the display quality, since the unit’s screen was turned off and locked.


On to the keyboard and trackpad. Being a 13-inch laptop, the Inspiron 13 7370 was given a TKL (tenkeyless, no numpad) keyboard layout. Travel distance is quite short and tactile feedback is good, making typing a pleasant experience. The keyboard deck itself is built like a tank, with no flex at all. Backlighting is also a plus here, and brightness is adjustable. However, the arrow keys are smaller than normal, which might cause wrong key presses at times.

The trackpad’s surface isn’t entirely smooth but fingers still easily glide on it. Down at its base are two buttons for the left and right click functions, both are springy and have good haptic feedback. We weren’t able to test responsiveness though because, as we’ve said, the laptop’s display was locked.

As for the I/O, Dell equipped the Inspiron 13 7370 with a pretty good selection of ports for a 13-inch device. On the left side, we have the power input, a USB 3.1 Gen 1 Type-C port, an HDMI port, a USB 3.1 Gen 1 Type-A port with PowerShare (it can be used to charge other devices even when the laptop is turned off), and an audio jack. The right side is a little less crowded, with only a Kensington lock slot, a USB 3.1 Gen 1 Type-A port, and a multi-card reader.


Overall, Dell’s Inspiron 13 7370 is shaping up to be quite the premium device, with a humble selection of essential ports, a well-constructed body, and a respectable screen size. The device is a great choice for anyone looking for a lightweight and portable machine. Performance is yet to be tested, but rest assured that if we get a review unit, we’ll certainly put it through its paces.

Dell Inspiron 13 7370 specs:
  • 13.3-inch FHD (1920×1080) IPS Truelife LED-Backlit Narrow Border Non-Touch Display
  • 8th-Gen Intel Core i5-8250U 1.60GHz Processor
  • Intel UHD Graphics 620
  • 8GB DDR4 2400MHz
  • 256GB PCIe NVMe SSD
  • Standard Widescreen HD (720p) w/ Digital Microphone
  • Backlit, spill-resistant keyboard, Multi-touch gesture-enabled precision touchpad with integrated scrolling
  • WiFi 802.11ac + Bluetooth 4.2
  • USB 3.1 Gen 1 Type-C
  • USB 3.1 Gen 1 Type-A
  • USB 3.1 Gen 1 with PowerShare
  • HDMI 2.0
  • 3-in-1 SD Media Card Reader
  • 2x speakers w/ Waves MaxxAudio Pro
  • Combo headphone/microphone jack
  • 38WHr, 3-Cell Battery
  • Windows 10

(yugatech.com, https://goo.gl/frKXQW)

Dell Inspiron 13 7373 2-in-1 Hands-on Review : First Impressions

Apart from the lightweight Inspiron 13 7370, Dell also released a second laptop in its new Inspiron 13 7000 portfolio. Say hello to the Inspiron 13 7373 2-in-1; as the name suggests, this particular device has a few different modes. The company’s new 2-in-1 notebook is targeted at consumers who need a flexible yet light and updated device. We were given a chance to experience this little transformer, and here are our initial thoughts.


The Inspiron 13 7373 2-in-1 could easily be mistaken for the 7370’s sibling, as the two are almost the same in build and looks, save for the fingerprint scanner, which is absent on the 7373. The power button is in the same exact spot, and the rest of the device is almost identical, even in build quality and structure. The screen has little flex, and the keyboard deck is as solid as a rock. The hinges, too, are relatively well-built, as they should be, since they’ll be stressed by the constant shifting of modes. The laptop itself is quite thin and light, making it portable enough to carry around.

Speaking of modes, the laptop has four — Laptop, Tent, Stand, and Tablet. Each mode caters to a different kind of use depending on the user’s needs. It certainly isn’t the first 2-in-1 out there, but it might very well be one of the best in terms of build quality.


The laptop sports the same 13-inch Full HD IPS display as the 7370, only this time it’s a touchscreen panel. The display produces nice colors and has good accuracy and viewing angles; it’s also very responsive to the touch and registers gestures relatively quickly. The panel has thin side bezels, although the top and bottom ones are somewhat thick. Nevertheless, the notebook has good screen real estate and a good screen-to-body ratio.


Moving on to the keyboard and trackpad: the Inspiron 13 7373 2-in-1 is equipped with a TKL (tenkeyless, no numpad) keyboard that has good travel distance and tactile feedback. However, like on the 7370, the arrow keys are small, which might cause some missed key presses at times. Backlighting is also present, and brightness is adjustable. As for the trackpad, it has a grain-like surface that is smooth enough for fingers to glide on and is responsive to commands and gestures. The two buttons for the left- and right-click functions are also good, providing enough haptic and acoustic feedback when clicked.

I/O is basically the same as on the 7370. We have a Kensington lock slot, a USB 3.1 Gen 1 Type-A port, and a multi-card reader on the right, and the power input, a USB 3.1 Gen 1 Type-C port, an HDMI port, a USB 3.1 Gen 1 Type-A port with PowerShare, and an audio jack on the left.


The Inspiron 13 7373 2-in-1 is perfect for anyone looking for a flexible notebook. It has a respectable screen, a solid build, an above-average keyboard, and a slim and light profile. Then again, this laptop targets a particular niche, and if you are not part of that niche, you might as well just pick up the Inspiron 13 7370, which is basically the same device with a lower price tag. Nevertheless, anyone needing a 2-in-1 notebook should keep an eye out for the Inspiron 13 7373 2-in-1.

Dell Inspiron 13 7373 2-in-1 specs:

13.3-inch FHD (1920 x 1080) IPS Truelife LED-Backlit Narrow Border Touch Display
360-degree hinge
8th Generation Intel Core i5-8250U 1.60GHz Processor
Intel UHD Graphics 620
8GB DDR4 2400MHz
256GB PCIe NVMe SSD
Backlit, spill-resistant keyboard, Multi-touch gesture-enabled precision touchpad with integrated scrolling
Supports Pen and Facial Recognition
Infrared camera for Windows Hello
WiFi 802.11ac + Bluetooth 4.2
USB 3.1 Gen 1 Type-C
USB 3.1 Gen 1 Type-A
USB 3.1 Gen 1 with PowerShare
HDMI 2.0
3-in-1 SD Media Card Reader
2x speakers w/ Waves MaxxAudio Pro
Combo headphone/microphone jack
38WHr, 3-Cell Battery
Windows 10

(yugatech.com, https://goo.gl/kKZQ5W)

Dell Inspiron 15 7577 Hands-on Review : First Impressions

The newest addition to the Inspiron 15 7000 gaming series was launched alongside the two new Inspiron 13 7000 laptops, the Inspiron 13 7370 and 7373 2-in-1. The successor to Dell’s highly regarded 7567 is none other than the Inspiron 15 7577. The new Inspiron 15 has a revamped design and packs some new yet familiar hardware. We were able to briefly tinker with the device, and here are our initial thoughts.


The Inspiron 15 7577 is one beefy device; we don’t mean that externally but rather internally, as the device is quite heavy. It still carries the same design cues and color as its predecessor: a blocky overall structure with a subtle black-and-red theme. That is not to say the notebook is ugly, though, as Dell did quite a good job of making the device look sleek while keeping its gaming looks low-key. The notebook’s weight also translates to its build, as the overall structure feels really solid and sturdy: it has minimal screen flex, non-existent keyboard flex, and well-built hinges. Its looks can be deceiving, as the notebook isn’t something you would consider thin or even relatively slim. Then again, gaming notebooks aren’t really known to be slim and light devices, with a few exceptions.

The laptop is equipped with a 15.6-inch Full HD anti-glare IPS display, which is now the standard display configuration for today’s laptops. Like any other good IPS panel, the 7577’s 15.6-inch FHD screen has good color reproduction, accuracy, and viewing angles. It may not have G-Sync or the high refresh rates and fast response times of other gaming notebooks, but it’s still a relatively good display.


Unlike its smaller 13-inch siblings, the Inspiron 15 7577 is equipped with a full-size keyboard in the standard 15.6-inch layout. The keyboard has good travel distance and excellent tactile feedback (for a laptop), making gaming and typing a joyous experience. Surprisingly, the trackpad is also above average: it has a smooth surface and responds quite well to taps and gestures. The two buttons at the bottom are springy and have good acoustic and haptic feedback.


Moving on to the laptop’s I/O: on the right, we have the audio jack, two USB 3.1 Gen 1 Type-A ports, a USB 3.1 Gen 1 Type-C Thunderbolt 3 port, and an HDMI port. Over on the left, we have a Kensington lock slot, the power input, an Ethernet port, a USB 3.1 Gen 1 Type-A port, and a multi-card reader. We are quite happy that Dell included a Thunderbolt 3 port on the device, as this is a huge plus for anyone looking to at least have the option of using an external GPU.

We are certainly interested in getting our hands on a review unit to really test the Inspiron 15 7577’s performance. In any case, we are impressed with the laptop’s build quality and overall port selection, not to mention its above-average keyboard and respectable display. Sadly, the updated hardware comes with a price increase, and the Inspiron 15 7577 carries a higher base asking price than its predecessor. Of course, further testing might justify that price jump.

Dell Inspiron 15 7577 (GTX 1050Ti) specs:

15.6-inch FHD (1920 x 1080) IPS Anti-Glare LED Backlit Display
7th-Gen Intel Core i7-7700HQ 2.80GHz Processor
NVIDIA GeForce GTX 1050Ti, 4GB GDDR5
8GB DDR4 2400MHz
128GB SSD + 1TB 5400 RPM HDD
Integrated Widescreen HD (720p) Webcam with Dual Array Digital Microphone
Backlit Keyboard with Red Print
WiFi 802.11ac + Bluetooth 4.2
HDMI 2.0
SuperSpeed USB 3.1 Gen 1 Type-A w/ PowerShare
Thunderbolt 3
2-in-1 SD card reader
RJ-45
Headphone/microphone combo
56 WHr, 4-Cell Battery
Windows 10 Home

Dell Inspiron 15 7577 (GTX 1060) specs:

15.6-inch FHD (1920 x 1080) IPS Anti-Glare LED Backlit Display
7th-Gen Intel Core i7-7700HQ 2.80GHz Processor
NVIDIA GeForce GTX 1060, 6GB GDDR5
16GB DDR4 2400MHz
256GB SSD + 1TB 5400 RPM HDD
Integrated Widescreen HD (720p) Webcam with Dual Array Digital Microphone
Backlit Keyboard with Red Print
WiFi 802.11ac + Bluetooth 4.2
HDMI 2.0
SuperSpeed USB 3.1 Gen 1 Type-A w/ PowerShare
Thunderbolt 3
2-in-1 SD card reader
RJ-45
Headphone/microphone combo
56 WHr, 4-Cell Battery
Windows 10 Home

The Dell Inspiron 15 7577 comes in two variants, the base 1050Ti variant is priced at $1,437 while the 1060 variant is priced at $1,708.

(yugatech.com, https://goo.gl/tCNwwB)

Nvidia GTX 1050 Ti vs. GTX 1060 Max-Q vs. GTX 1060: What’s the Best Value?

Just when we thought we had this whole Nvidia Pascal thing figured out, the company throws us a curveball and introduces a Max-Q model of its GTX 1060 GPU.


The new addition marks yet another option for entry-level and budget-conscious gamers looking for a system that can deliver solid frame rates while they’re gaming, and support virtual reality without draining their bank account. But which card is right for your gaming/VR needs and your wallet?

What is Max-Q?

Before we dive into the nitty-gritty of specs and pricing and whatnot, let’s talk about Max-Q. The term itself is borrowed from aerospace engineering. Essentially, Max-Q GPUs are designed to fit into thin-and-light gaming laptops like the Asus ROG Zephyrus or the Acer Predator Triton. The drivers on Max-Q cards are optimized for efficiency and power usage, whereas traditional GPUs are tuned for performance. That efficiency quotient also lends itself to an overall quieter laptop, since less power consumption and more efficient performance leads to smaller, quieter fans.


Specs compared

In the Nvidia Pascal hierarchy, the GTX 1060 Max-Q is nestled right between the 1050 Ti and the GTX 1060. The Max-Q and the 1060 are basically the same card, but thanks to the strict tenets of Max-Q design, the regular card has base and boost clock speeds of 1,404 and 1,670 MHz, respectively, while the more efficient part runs a base clock of 1,063 to 1,265 MHz and boosts to between 1,341 and 1,480 MHz. Everything else is identical: the CUDA core count (1,280; CUDA is the parallel computing platform and programming model used to harness the power of the GPU), the memory speed and type (8 Gbps, GDDR5), the 192-bit memory bus and the VRAM (up to 6GB). Nvidia claims that the 1060 is 10 to 15 percent faster than its Max-Q iteration.


You’ll find a more pronounced difference between the Max-Q and the 1050 Ti, with the latter sporting only 768 CUDA cores, though with higher base and boost clocks (1,493 and 1,620 MHz, respectively). The memory speed and type (7 Gbps, GDDR5), 128-bit memory bus and VRAM (4GB) are all noticeably lower. Compared to the 1050 Ti, the 1060 is 60 percent faster.
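As a rough sanity check on that figure, multiplying CUDA cores by boost clock gives a crude first-order throughput estimate. This is a sketch, not a benchmark: it ignores memory bandwidth, thermals and drivers, which is why real-game gaps land closer to Nvidia's quoted 60 percent.

```python
# Crude first-order throughput estimate: CUDA cores x boost clock (MHz).
# Ignores memory bandwidth, thermals and drivers, so real-game gaps are smaller.
gtx_1060   = 1280 * 1670   # full GTX 1060: cores x boost clock
gtx_1050ti = 768 * 1620    # GTX 1050 Ti:  cores x boost clock

ratio = gtx_1060 / gtx_1050ti
print(f"{(ratio - 1) * 100:.0f}% more raw shader throughput")  # → 72% on this crude metric
```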

Each of these cards supports Nvidia’s Ansel and G-Sync technology, but none of them can be used in an SLI configuration. Both the 1060 and the Max-Q can run the Oculus Rift and HTC Vive, while the 1050 Ti has limited Rift support, thanks to Oculus’ Asynchronous SpaceWarp technology.

Performance

As expected, the Nvidia 1060 consistently beat both the Max-Q and the 1050 Ti system. However, we were surprised at how close the results were between the Max-Q and the full GTX 1060. For our tests, we used the latest version of the Dell Inspiron 15 7000 Gaming laptop (pictured) with the Max-Q 1060 and a model from earlier in the year equipped with a 1050 Ti. To round out the face-off, we turned to an Alienware 13 outfitted with a full 1060 GPU.


When we ran our traditional gaming benchmarks, we discovered that, more often than not, the Max-Q was only one or two frames behind the full 1060 GPU. For instance, the Alienware 13 notched 32 frames per second on the Rise of the Tomb Raider test on Very High at 1920 x 1080. The Max-Q Inspiron 15 delivered 31 fps, while the 1050 Ti version posted 22 fps.

The Alienware 13 maintained its lead on the Hitman benchmark, scoring 63 fps compared to the Max-Q Inspiron 15, which obtained 55 fps. The 1050 Ti Inspiron 15 managed to top our 30-fps playability threshold with a score of 35 fps.

Switching over to the Grand Theft Auto V test, the Alienware 13 obtained 49 fps, just barely keeping ahead of the Max-Q Inspiron 15, which hit 44 fps. Meanwhile, the 1050 Ti Inspiron 15 delivered a playable 31 fps.

To check for VR readiness, we ran the SteamVR Performance Test. The Alienware 13 earned a score of 6.9, while the Max-Q Inspiron hit 5.9. Both scores are acceptable for supporting both the Oculus Rift and the HTC Vive. The 1050 Ti Inspiron 15 managed only 3.3, which can’t support the Vive but, thanks to some Oculus technology, works with the Rift. The laptop can also easily support any of Microsoft’s MR headsets, including the Acer Windows Mixed Reality AH101 headset.

Which GPU offers the best value?

To learn how much the GPUs add to the overall cost of your laptop, we configured the two Dell Inspiron 15 7000 Gaming laptops and the Alienware 13 as closely as possible. Unsurprisingly, at $849, the 1050 Ti Inspiron is the least expensive option. That nets you a laptop with an Intel Core i5-7300HQ processor, 8GB of RAM, a 256GB SSD and an Nvidia GTX 1050 Ti GPU with 4GB of VRAM. However, for an additional $50, you can get the Max-Q GTX 1060 GPU.


In the case of the Alienware 13 (pictured), the cheapest model you can get with a full 1060 GPU costs $1,249. For the price, you get a laptop with a Core i7-7700HQ CPU, 8GB RAM, a 256GB PCIe SSD and the GTX 1060 GPU with 6GB of VRAM. However, instead of the gorgeous 2560 x 1440 OLED touch display, you’ll have to make do with a 1920 x 1080 nontouch panel. To upgrade to the OLED screen, you’d have to pay $1,899.

So, which do I choose?

I’m always a proponent of more power, so I’d typically recommend the system with the more powerful GPU, which in this case would be the GTX 1060. However, the GTX 1060 Max-Q offers comparable performance, including true VR readiness, at a lower price. For instance, the Alienware 13 costs $1,249, compared with the Max-Q Dell Inspiron 15 7000 Gaming, which is priced at $899.

I really wouldn’t recommend a 1050 Ti laptop, because you lose out on full VR readiness, and it’s significantly less powerful than the Max-Q system. Besides, the 1050 Ti model is only $50 less than the more powerful Max-Q laptop. Overall, in terms of balancing cost and performance, Max-Q is the way to go.

(laptopmag.com, https://goo.gl/c7MHYZ)

Qualcomm invades Intel’s turf with Snapdragon PCs that push battery life over performance

Qualcomm showed Snapdragon PCs from Asus and HP that promise two-day battery life and always-on connections.

Qualcomm is invading Intel’s turf, announcing Windows PCs that use the same Snapdragon chips as your phone, with battery life that can last well into a second day of use.

On Tuesday at its Snapdragon Technology Forum, Qualcomm showed off its Snapdragon 835 Mobile PC Platform on a HP Envy x2 tablet and an Asus NovaGo ultrabook. (A third PC, from Lenovo, will be announced at CES in Las Vegas.) Both run on the company’s Snapdragon 835—yes, the same processor (and cellular modem) inside popular phones like the Samsung Galaxy Note 8.


Qualcomm uses its success with smartphones to justify its foray into PCs. You demand all-day performance from your phone, while it’s constantly connected to the Internet. Why shouldn’t your PC deliver the same?

Let’s clear up one concern right away: Qualcomm’s Windows PCs are running Windows 10, not the abandoned Windows RT variant that only ran Microsoft’s UWP apps. However, these PCs emulate non-UWP apps, slowing performance. Qualcomm hopes you’ll be willing to trade some speed for the promise that the Snapdragon Mobile PC platform will deliver 14 to 24 hours of constant use, interspersed with idle periods of “connected standby” time.

What this will mean for you: At some point, the performance of your phone, tablet or PC exceeds your demands—what we call “good-enough” computing. Qualcomm’s betting we’re already there, at least for a chunk of potential users, and it’s focusing on basic productivity, always-on (cellular) connectivity, and battery life.

Many questions hang in the air: Is “good-enough” computing good enough for you? How well does a Snapdragon PC perform on everyday apps that are emulated, such as Google Chrome? How close to reality are these battery life claims? Will customers want to pay for an additional cellular plan? If Qualcomm can deliver on its claims and offer (affordable) always-on WWAN connectivity, a little competition for Intel is always good news for consumers.


Qualcomm’s vision of the connected PC, powered by its Snapdragon chip. 

Qualcomm Snapdragon PCs: power, not performance

Keep in mind that chip makers like Intel—and, to a lesser extent, AMD—are interested in selling you chips that offer the highest performance possible for the lowest price. That’s not Qualcomm’s priority.

“Most people working in these form factors are interested in the connectivity piece, and things like music, email, some productivity, shopping—it’s mostly an extension to a phone,” said Miguel Nunes, senior director of product management for Qualcomm. “We don’t see people using heavy workloads, like graphic design. If they do, they shy away from these form factors.”

“I’ve been using one of these [Snapdragon-powered] devices for several months,” Nunes added. “It’s replaced my Surface Pro device, and I go multiple days without charging.”


The HP Envy x2 will be one of the first to include a Qualcomm Snapdragon chip inside.

Nunes was referring to a Snapdragon-powered version of one of the devices, the HP Envy x2. PCWorld reviewed a similar tablet, the HP Elite x2, which includes an Intel Core m chip inside. The Elite x2 delivered over seven hours of battery life in our tests, which included a constant video rundown. Qualcomm claims that same tablet will deliver 20 hours of battery life with a Snapdragon inside of it.

Put another way, Qualcomm believes that a device with a 48 watt-hour battery—basically the battery within the latest Microsoft Surface Pro—will last 21.2 hours when looping 1080p video. (For reference, the Surface Pro lasted about 8.5 hours before running out of battery in our tests.)
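Those two runtimes imply very different average power draws from the same 48 watt-hour battery; the back-of-the-envelope arithmetic (our calculation, not a Qualcomm figure) looks like this:

```python
battery_wh = 48.0  # roughly the capacity of the latest Surface Pro's battery

snapdragon_hours = 21.2   # Qualcomm's claimed 1080p video-loop runtime
surface_pro_hours = 8.5   # the Surface Pro's runtime in our tests

# Average draw = capacity / runtime
snapdragon_watts = battery_wh / snapdragon_hours
surface_pro_watts = battery_wh / surface_pro_hours
print(f"Implied average draw: {snapdragon_watts:.1f}W vs. {surface_pro_watts:.1f}W")
```

In other words, Qualcomm is claiming its platform sips around 2.3 watts while looping video, versus the roughly 5.6 watts the Intel-based Surface Pro drew.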


The Asus NovaGo ultrabook, also powered by a Qualcomm Snapdragon processor.

Qualcomm also released a partial list of the specs of each machine, in part to reassure potential customers that they were getting a “true” PC, with support for the various peripherals and other components that make up a PC.

Asus NovaGo:
  • Display: 13.3-inch, 1920×1080 LED-lit panel
  • CPU: Qualcomm Snapdragon 835 Mobile PC Platform
  • Memory: Up to 8GB
  • Storage: Up to 256GB UFS 2.0
  • Connectivity: Qualcomm X16 modem (4×4 MIMO); 802.11ac (2×2 MIMO)
  • Input: Stylus, two USB 3.1 Type C ports
  • OS: Windows 10 S
  • Dimensions: 12.4 x 8.7 x 0.59 inches, 3.06 pounds
  • Price: $599 for 4GB RAM/64GB storage; $799 for 8GB RAM/256GB of storage
  • Ship date: Undisclosed

HP Envy x2:
  • Display: 12.3-inch WUXGA+ (1920×1200) panel
  • CPU: Qualcomm Snapdragon 835 Mobile PC Platform
  • Memory: 8GB LPDDR4
  • Storage: Up to 256GB
  • Connectivity: Snapdragon X16 LTE modem
  • OS: Windows 10 S
  • Price: Undisclosed
  • Ship date: Undisclosed

Qualcomm executives declined to reveal details of the Lenovo Snapdragon device, which will be announced at CES in Las Vegas.

Interestingly, both the HP Envy x2 and the Asus NovaGo use Windows 10 S. That’s important because in our tests, using Windows 10 Pro significantly reduced the battery life. Testing the Microsoft Surface Book running Windows 10 S yielded a whopping 765 minutes of battery life. “Upgrading” to Windows 10 Pro cut the battery life to 654 minutes, a decrease of 14.5 percent.
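That 14.5 percent figure follows directly from the two runtimes; a one-line check:

```python
win10s_minutes = 765    # Surface Book battery life on Windows 10 S
win10pro_minutes = 654  # battery life after "upgrading" to Windows 10 Pro

decrease = (win10s_minutes - win10pro_minutes) / win10s_minutes
print(f"Battery life decrease: {decrease:.1%}")
```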

Qualcomm’s coming out strong, with a long list of PC partners. Terry Myerson, the corporate vice president in charge of Microsoft’s operating systems division, revealed that hundreds of Qualcomm-powered devices had been in use on the Microsoft campus for months, a fact confirmed by other Microsoft attendees.

Myerson also said on stage that he hadn’t plugged in his device in a week of use, although it was unclear how heavily he used it.

Snapdragon PCs will be slower, but does it matter?

To its credit, Qualcomm admits that the performance of a Snapdragon PC will be slower than what you get from its Intel- or AMD-based competition. In part, that’s because the Snapdragon chips aren’t designed to process the code natively. Instead, the Snapdragon 835 passes the code through some intermediary steps, including an abstraction layer and emulator. This was a step that Windows RT devices, and the ARM chips they ran on, left out.


Qualcomm breaks down the estimated battery life of its Qualcomm Snapdragon 835 PCs here…

As a result, Windows UWP apps (Mail, Calendar, Edge, plus other native apps like Twitter, Spotify, and the like) will always run at the chip’s maximum performance. A whole host of traditional .EXE apps, including most games, browsers like Chrome and Firefox, and synthetic benchmarks, simply won’t run as fast.


…and here, with a more detailed breakdown of the Qualcomm Snapdragon 835’s power consumption.

Naturally, Qualcomm is downplaying the impact. Any application is dependent upon a combination of CPU, GPU, memory, and storage, Nunes said. “You will see a few differences here and there… but it’s nothing that’s going to impact the user experience.”

“You may see something launch in 1 second, while on the other platform it launches in 1.4 seconds,” Nunes explained. “That’s 40 percent slower, but really, that doesn’t matter.”

Nunes instead emphasized the power efficiency of the Snapdragon 835 Mobile PC Platform. “It’s better to address the battery life, as that’s what most people care about.”

When asked to define the performance penalty that emulation would cost, Nunes again demurred. “Really, it depends on the app,” he said.


These are the “performance” metrics Qualcomm wants you to think about when considering Snapdragon 835 Mobile PCs.

Fortunately, it seems like some of these concerns may be overblown. Pat Moorhead, a former AMD employee and now an independent analyst, told PCWorld that he had used a Qualcomm-powered product for several days and that Chrome performed acceptably—“better than I expected,” he said.

Qualcomm feels it can make up some of that gap as more powerful CPUs throttle themselves, reducing the clock speed under load to control heat output. But executives admit you’ll notice differences between native and emulated apps. In part, that’s due to a design quirk in the ARM architecture that underlies the Snapdragon chip: ARM uses a combination of “big,” powerful cores together with more power-efficient “little” cores. It’s these little cores that help provide the long battery life, but can’t quite keep up when under load.

Snapdragon PCs will test our taste for “good-enough”

For the average user, the Snapdragon 835 Mobile PC Platform’s strengths and weaknesses play into how the PCs will be marketed. When Asus and HP ship the new Qualcomm-powered devices, the companies will emphasize “user experience” metrics like how fast apps will open, rather than synthetic benchmarks like Cinebench or PCMark.

Also, the apps you use will matter. “You can run Chrome on it. Edge is significantly more optimized, but Chrome will work,” Nunes said.

If Qualcomm’s Snapdragon 835 Mobile PC Platform is to take off, consumers are going to have to accept PCs with “good-enough” performance in exchange for two days or so of battery life and pervasive WWAN connectivity. So far, we’ve had two architectures try that argument on for size: Intel’s Core m and Atom chips. Intel’s Core m arguably succeeded. The Atom did not, offering a sluggish computing experience often compounded by anemic storage and memory.

For years, however, consumers have pleaded with smartphone makers to increase the battery life. Now Qualcomm has the opportunity to help PC makers do the same. Will they succeed? As soon as we can get devices in our hands, we’ll tell you.

(pcworld.com, https://goo.gl/i3HarX)

Android 8.1 Oreo is here: What’s new, what’s changed, and what’s awesome

Everything you need to master the latest major release of Android.


Get your phones ready, because Android Oreo is finally here. But its name isn’t the only thing that’s sweet about Android 8. While it might not be as jam-packed with features as prior Android releases, Android Oreo has plenty of features that make it a must-download, from picture-in-picture to notification changes that will help you keep annoying alerts at bay. And now Android 8.1 has arrived to bring even more awesome features and enhancements. So bring your sweet tooth, because there’s a lot to chew on.

And you don’t need a Nexus or Pixel phone anymore to enjoy it. The first non-Google phones are finally starting to support it, and more will be added soon. So, before we look ahead to what confectionery Android P will adopt (my money’s on Petit Four), check out the expanding list below and find out when it’s coming to your phone, and read up on all the new stuff while you wait:

Android 8 Oreo update FAQ

Can I install it on my phone?

As with any new Android release, the devices on which you can install Oreo are extremely limited. Here’s the list:

  • Pixel
  • Pixel XL
  • Nexus 5X
  • Nexus 6P
  • Pixel C tablet
  • Nexus Player set-top box

When’s it coming to my non-Google phone?

Other than the devices above, you’ll need to wait for manufacturers and carriers to begin rolling out their own versions of the OS. So far, only four phones support Oreo out of the box:

  • Pixel 2
  • Pixel 2 XL
  • Mate 10
  • Mate 10 Pro

As for the other manufacturers, Google says it’s working with its partners to deliver Android 8 to phones “by the end of this year,” and there are a lot of betas already in the works from Samsung, Essential, HMD (Nokia), Huawei, HTC, LG, Motorola, OnePlus, and Sony. OnePlus says an Oreo update will be available by the end of the year, and HTC is promising an update to the U11 and U11 Ultra by early December.

OK, I have one of the supported phones. How do I get the update?

Once the Android Oreo update is ready for your phone, you’ll receive a notification of a pending system update. Tap it and you’ll be taken to the Settings app where you can proceed to download and install it. If by chance you want to install the update manually, you can find the factory images for Pixel and Nexus devices here.

Do I need to unenroll from the beta program first?

Nope, there’s no need to do that. Even though your phone will continue to say you’re enrolled in the beta program, once you get the update, you’ll still be running the final version of Android Oreo, just like everyone else. And as new betas land for 8.1 and beyond, you’ll be among the first to get them, too.

Won’t Project Treble help me get updates quicker?

Project Treble is more of a foundation for the future than a current user feature. Designed to make it easier for manufacturers to deliver timely updates, it will presumably mean that your Galaxy S and LG G phones won’t have to wait as long to get the latest version of Android. However, it’s probably not going to affect the speed of Android 8 updates.

As Android’s engineering team explained in a recent Reddit AMA: “Devices launching with Android O will come Treble-enabled out of the box. Project Treble will make it easier, faster and less costly for device maker partners when these devices are updated in the future.” So, while Android P might make it to non-Pixel phones quicker, it won’t have an effect on Android Oreo updates.

Android 8 Oreo features

When you launch Android Oreo for the first time, you won’t be smacked with any obvious new features—but there are still a few worth checking out. Google has divided its improvements into two main areas: “Fluid experiences,” which bring productivity and UI changes to help get things done faster, and “Vitals” to keep your phone running smoothly while demanding less battery power.

Here’s everything that’s new in Oreo:

Settings


The Settings app in Android Oreo (left) has gotten a facelift as you can see in this comparison with Android Nougat (right).

The most obvious change to the interface and navigation can be found in the Settings app. There’s a new icon inspired by Oreo’s aquamarine-accented motif, and many of the menus have been rejiggered and rearranged. Gone are the categories for Wireless & Networks, Device, Personal, and System. Instead, various settings have been given smarter groupings. For example, Network & Internet collects Wi-Fi, mobile, data usage, and hotspot settings into a single screen, while Connected Devices does the same for Bluetooth, Cast, NFC, and Android Beam.

Individual settings screens have been tweaked as well. Tap on the Battery tab, for instance, and you’ll see a new visualization of remaining run time (tap it to get back to the old chart), as well as toggles for battery saver and adaptive brightness, and the inactivity sleep timer. You’ll have to explore yourself to find out where everything is, but if you get lost, you can still use the handy search icon in the top right corner.

Picture-in-picture


When you’re watching a video in Chrome using Android 8, you can turn it into a picture-in-picture window on your home screen.

Of all the new stuff in Oreo, the feature everyone is going to want to try out first is picture-in-picture. It doesn’t yet work with a lot of apps, but it’s a feature developers will likely want to support as quickly as possible. Using it is easy. When you’re watching a full-screen video in YouTube or Chrome, just press the home button and the video will shrink down to a window that floats on top of whatever else you’re doing.

From there you can move it around the screen, close it out, or tap to launch the app again. It’s a feature that’s sure to be more useful on Android Oreo tablets than phones, but on the giant screen of the Nexus 6P, the tiny window is definitely watchable.

Autofill


Autofill will be super-charged to work with third-party password managers.

Picture-in-picture might be Oreo’s coolest feature, but its most useful one will likely be autofill. I know, you’re thinking, “We’ve had autofill in Android for years,” but the new approach applies directly to passwords, and goes a step beyond Google Chrome’s Smart Lock feature.

Just like you can customize Android’s keyboard with a better one, now you can customize password management with a third-party platform. And it works all over Android, not just in Chrome. That means when you reach an app that requires a saved login in Android 8, the fields will automatically populate using info from your personal password vault. And it’ll work with your password manager of choice: Dashlane, 1Password, and Enpass have already announced support for autofill in Android 8. So if you aren’t using a password manager, now’s a great time to start.

Notifications

Every new Android release includes some changes to notifications, and like Nougat, Android Oreo brings some pretty big ones. It starts with the notification shade. The quick settings panel is now white instead of black, and the Settings app shortcut has been moved to the space below the icon strip. A couple of the quick settings tiles have changed as well. The battery icon has been replaced with Battery saver, but you’ll still be able to see your remaining battery life in the status bar (previously it disappeared when you pulled down the shade). And there’s a new System icon that tells you the version of Android you’re running. The Night Light tile is gone as well.


The notification shade has gotten some new options in Android O.

The way notifications are handled has also changed. Swipe right and you’ll see two icons: Settings and a new clock—touch the clock to snooze the alert for up to two hours. Also, if you long press on a notification, you’ll be able to turn off all future alerts. On some apps you’ll see a simple switch, but others will have a Categories button, which lets you get granular with what notifications you receive. So, instead of an all-or-nothing decision, you can now choose what type of notification “channels” you will receive without needing to fuss with the individual app’s settings.

In Maps, for example, there are 31 separate categories that can be switched off or silenced, so if you want to be alerted of location sharing but not new places, you can do that right in Settings. Most third-party apps don’t have any options yet, but once they start rolling out, notification categories should help you keep your notification shade a whole lot neater.


Android O puts small dots on icons to alert you to unread notifications. Then you can long press to see them.

Finally, Android 8 is introducing icon badges—or as Google calls them, dots—for unread notifications. They won’t display a numeral that indicates the specific number of unread notifications (a feature in Nova and other launchers), but the dots will give you a visual indication that an alert has arrived. They’re visible whether the app is on the home screen or inside the app drawer, and if you long press on an app icon, you’ll see your unread notifications. Tap to open them in the app, or clear them with a swipe.

Smart text selection


Text selection has gotten a whole lot smarter in Android O.

Another useful feature in Android Oreo is smart text selection, which aims to cut down on various text-handling frustrations. When you tap on an address in Oreo, the text-selection engine will be smart enough to recognize the full address, not just the word you’ve tapped on. And once it’s selected (by double-tapping the original highlighted word if it didn’t get it the first time), you’ll see a new option to head straight to Google Maps or (in the case of a phone number) the Phone app. There’s also a handy new “Paste as plain text” option that will strip any formatting.

Battery improvements

Google has optimized much of Android Oreo behind the scenes to make your battery last longer, but there are a few things you can see. In the notification shade, a persistent notification will now alert you to any apps that are running in the background. You can also finally opt to display your battery percentage next to the status bar icon at the very top of your display.


There are lots of battery improvements in Android O, but you won’t be able to see most of them.

But the real improvements to battery life will be under the hood. Google has concentrated its efforts on three main areas: implicit broadcasts, background services, and location updates. That means that Android Oreo will severely limit what apps can do when you’re not using them, so rogue operators won’t be able to harpoon your battery life. Again, most of these changes won’t affect your daily use—you’ll still be able to play Spotify songs and get directions—but users of older phones should notice an uptick in their battery life.

Speed boost

Google understands our pain when it comes to Nougat boot times, and it has seriously upgraded Android 8 to cut down on the time it takes to load. All Android Oreo devices should see significantly faster boot times, but Pixel owners will particularly benefit. Google says the Pixel and Pixel XL now boot in about half the time they took under Nougat, and the upcoming Pixel 2 will surely push that even further.

Icons


Google wants icons to be more uniform and adaptive in Android O.

Android Oreo is introducing adaptive icons in an effort to create some unity over how they look. Much like last year’s push for circular icons with Nougat and the Pixel Launcher, Oreo is pressing developers to submit icons that can dynamically change with the system, so they can be square on one phone and circles on another without upsetting the overall vision for the icon. The new system also allows developers to add visual effects and subtle animations to their icons, such as parallax or scaling effects.

Emoji


The familiar “blob” emoji (above) are gone in Android O, replaced with much more cartoonish ones (below).

Here’s something you will definitely notice in Android Oreo: New emoji. Say goodbye to the blobs and hello to a new set of easier-to-distinguish cartoons. But the old blobs aren’t gone completely: You can download the old-school emoji as an animated sticker pack in Google Allo.

What you can’t see

Android Oreo features a slew of under-the-hood and behind-the-scenes improvements to make your smartphone experience even better: things like background app limits, smarter Wi-Fi, and Adobe RGB and Pro Photo RGB color profiles. Take a look at some of the more technical improvements in Android 8 here (https://goo.gl/UKuvi3).


Pixel Visual Core

Android 8.1 is more of a maintenance release than a feature one, and most users won’t notice many things that are different. But it does bring one big change to the Pixel 2 and Pixel 2 XL: It unlocks Pixel Visual Core, Google’s first custom-designed coprocessor dedicated to image processing.

For whatever reason, the chip was dormant in 8.0, but now developers can tap its benefits. All we know for sure is that Visual Core improves the speed and power efficiency of shooting in HDR+ mode. Most users probably won’t notice an immediate change, but the Pixel Visual Core could lead to bigger changes in the future.

Automatic dark and light theme


Android 8.1 (left) includes a dark mode. Well, kinda.

When Android Oreo launched, one of the things it was missing was a dark theme, a feature that has been teased in developer previews for years. However, while there isn’t a switch to turn the interface dark, Pixel 2 users discovered that they could “trick” Android into displaying a dark background on the app drawer and notification shade by picking a dark wallpaper. In Android 8.1, all users can now enjoy the pseudo dark theme.

New cheeseburger emoji

After a days-long kerfuffle, Google has admitted to the world that its cheeseburger emoji is wrong. In previous versions of Android, the cheese rested on the bottom bun in an affront to hamburger lovers everywhere. In Android 8.1, order has been restored, and the emoji has been redesigned to put the cheese on top of the burger.

Ambient display

Google’s ambient display in Android 8.0 introduced a minimal look that might be too minimal for some users. The new ambient display in 8.1 adds the date above any notification icons, as well as an alarm icon if one is set, to match the ambient display that shipped with the Pixel 2.

Redesigned power menu


The power menu in Android 8.1 (right) is much less obtrusive than it was in Android 8.0.

When you press the power button to shut down or restart your device in Android 8.1, the options will no longer take over your entire screen. Instead, a small window will appear on the right side of the screen. It’s a minor change, but it speaks to how light and unobtrusive Android is getting.

Android Go

Android Go is a stripped-down version of the full Android release designed specifically for devices with 512MB to 1GB of memory. It’s meant to boost the speed and reliability of entry-level devices, as well as provide the security that’s often missing in low-end phones.

In addition to a leaner and faster OS, Google has also built a set of optimized apps that are smaller than their full Android counterparts. Users will still be able to download full versions of any apps available in the Play Store, but pre-loaded Google apps—including the Google app, Google Assistant, YouTube, Google Maps, Gmail, Gboard, Google Play, Chrome, and the new Files app—will be optimized to run faster with less memory. Google says developers are building “Go” versions of many popular third-party apps, too.

While Android Go is built into Android 8.1, it will take several months before the first devices to use the new OS arrive.

Android 8 Oreo tips

Get notification dots to appear on your Nexus 5X or 6P

If you own a Nexus 6P or 5X, you probably aren’t seeing notification dots on your phone. That’s because the feature requires the Pixel Launcher. But don’t worry, it’s an easy fix. Head over to APKMirror and download the Google-signed Pixel Launcher APK for Android 8. Install it on your phone and head over to the Settings app. Go to Apps & notifications, tap Default apps, then Home app, and select Pixel Launcher as your default. As long as you keep it as the default, you’ll get to enjoy notification dots on your apps.

Choose which apps display notification dots

You can opt to disable all app dots in the Notification settings, but if you want more control over which app gets to display the blue dot, each app has its own toggle. To tweak the settings for each app, go to Settings, then Apps & notifications, App info, and finally App notifications. Inside you’ll see an Allow notification dot toggle.

Enable or disable Picture-in-Picture for individual apps


Android O gives users control over the PiP settings.

Picture-in-picture is a very new Android feature, and as such, it only works with a couple of apps, namely YouTube and Chrome. However, Google hasn’t limited the feature to video apps, and you’ll soon be able to have all sorts of apps floating around your home screen. To see which ones can be used, open Apps & notifications in Settings and select Special app access. Inside there will be a Picture-in-picture option. Tap it and you’ll be able to see all the apps available to use Picture-in-picture, each with its own toggle to enable or disable the feature.

Show battery percentage in the status bar


You can finally put the battery percentage in the status bar in Android O.

Nearly every Android phone includes the option to display the battery percentage next to the icon in the status bar—except it wasn’t a stock feature until now. In Android Oreo, you’ll find a toggle inside the new Battery tab in Settings. Flip it on and you’ll always know exactly how much juice you have left.

Choose your autofill provider

Once password managers start updating their apps with support for Android Oreo’s Autofill, you’re going to have to pick one as your default. Just like with keyboards, you can find the option in the Languages & input tab in Settings. Tap Autofill service and you’ll get a list of any apps that support autofill (including Google’s own service), and you’ll be able to select the one you want.

Adjust Night Light

Night Light was one of our favorite new features in Android Nougat, but Google didn’t allow any control over it. That’s changed in Android 8. Head over to the Display tab in the Settings app and you’ll find a new intensity slider below the Night Light toggle. The higher you turn it up the less blue light is emitted, and the yellower your screen will appear. (Note: Night Light only works on Pixel phones.)

Limit background activities on older apps

Apps that haven’t been upgraded to take advantage of the new Oreo APIs won’t get to take automatic advantage of the new limit on background activities, but there is a way to force it. Head over to the Settings app and tap App info inside the Apps & notifications menu. Find the app you want, select it, and you’ll see a Background activity toggle. Flip it blue and Oreo will limit what it can do when you’re not using it. For other hidden tips and tricks, check out our article here.

Reveal the Android Octopus

The Easter egg in Android Oreo isn’t a cat… it’s an octopus. Eight legs, get it?

Just like in prior Android releases, Google has hidden a fun Easter egg in Oreo. Go to Settings and scroll all the way down to the System tab. Tap it, then About phone. Tap Android version a bunch of times and you’ll see a giant Android O symbol pop up on your screen. Long-press the center until you feel a vibration (it might take a couple of attempts) and an animated octopus will appear on the screen that you can stretch and drag around.

(pcworld.com, https://goo.gl/L4a1NZ)


ASUS NovaGo and HP ENVY x2 hands-on review : Snapdragon’s Windows stars

Welcome to the familiar: I’ve just spent some time with the first Windows 10 on Snapdragon devices, HP’s ENVY x2 and ASUS’s NovaGo, and they’re both new and old in one. Unveiled today at Qualcomm’s Snapdragon Summit, the two new ultraportables – one a 2-in-1, and the other a detachable tablet – promise a whole new way of working on the move. In your hands, though, they feel much like the Windows 10 machines we’ve already seen.

The HP ENVY x2 is, effectively, the Snapdragon 835 version of the existing Spectre x2. Whereas the Spectre uses Intel x86 chips, the ENVY x2 packs Qualcomm’s Snapdragon 835. The end result is a Windows detachable that feels familiar, though should have some big improvements in usability.

First things first: HP’s 12.3-inch WUXGA+ touchscreen is bright and has decent viewing angles, while the magnetically attached keyboard cover has some neat detailing with its metal hinge. That allows the ENVY x2 to be adjusted between 110 and 150 degrees, which you’ll be glad of since the screen is highly reflective.

HP sensibly included a loop for the digital pen, used for Windows Ink. Tug the tablet free of the keyboard and it’s light enough – at 1.54 pounds – to be used one-handed. You might not want to, however, as the keyboard itself is solid for a detachable, eschewing things like the Surface Pro’s fabric in favor of backlit plastic buttons with a reasonable 1.3mm of travel.

HP ENVY x2 Gallery:

If you’re sticking with the keyboard, however, you might just want to head over to ASUS. The NovaGo doesn’t mark much of a departure in design from the company’s other affordable Windows 10 ultraportables, but that’s no bad thing when it means it’s lightweight and reasonably slim.

Despite that, you get two USB 3.1 Type-A ports, whereas the HP only gives you USB-C. ASUS finds space for an HDMI port, too, and the NovaGo has a bigger trackpad. I didn’t have long to play, but the keyboard lacks annoying bounce, and the hinges are sturdy and don’t allow the whole notebook to flop around or sag.

Really, though, the sum of these two computers is about more than just how they look. Qualcomm and Microsoft’s big boast is how much more efficient their Snapdragon heart is: up to 20 hours of battery life in the case of the HP, and up to 22 hours from the slightly heavier ASUS. That’s despite niceties like integrated 4G LTE cellular connectivity and the benefit of instant-on.

Like the Surface Laptop, both will come with Windows 10 S. However, also like the Surface Laptop, you’ll be able to upgrade on-device to Windows 10 Pro. We’ll have to wait and see, not only how long practical battery life is in the real-world, but whether full apps designed with x86 machines in mind are going to run successfully on Qualcomm’s hardware.

If they can, and if the pricing for 4G LTE connectivity is right, these two machines could represent a very interesting evolution in Windows computing. One of my key complaints about the Surface Laptop, for instance, was its battery life and connectivity simply didn’t make up for the limits of Windows 10 S. Double the battery and throw in Gigabit LTE, however, and suddenly you might have something to really get consumers excited.

ASUS NovaGo Gallery:

(slashgear.com, https://goo.gl/WydCWu)

Windows 10 on ARM: 4 facts you need to know

In what could possibly be the biggest shake-up to Windows in years, Microsoft and Qualcomm announced the first ARM-powered Windows 10 PCs today, promising all-day battery life and ubiquitous connectivity. With Qualcomm’s Snapdragon 835 at their heart, the new ultraportables may look like their Intel and AMD counterparts at first glance, but there are some very good reasons inside why you might want one of these ARM-based models instead. Read on for what you need to know.

The battery life promises are huge

Your smartphone lasts all day; your laptop manages maybe half of that, if you’re lucky. Now, Qualcomm wants to bring the sort of power expectations from phones and tablets to laptops and 2-in-1s. One of the big advantages of a chipset like the Snapdragon 835, which was created with devices that have small batteries in mind, is that it’s far more frugal with its power needs than a regular x86 Intel or AMD processor.

Couple that with the possibility of a large battery in a tablet or notebook form-factor – and the added benefit of the Snapdragon 835 mainboard being physically smaller than those of its x86 rivals too – and you can fit a whole lot of power into the same form-factor. According to Qualcomm, 20+ hours of runtime is realistic, or over 30 days of standby.
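Those runtime claims come down to simple watt-hour arithmetic: a chipset that averages a couple of watts stretches the same battery far further than one drawing several times that. A minimal sketch — the battery capacity and draw figures below are illustrative assumptions, not specs of these machines:

```python
# Rough battery-life arithmetic behind the "20+ hours" claim.
# All figures here are hypothetical, not measured values for the
# NovaGo, ENVY x2, or any specific x86 ultraportable.

def runtime_hours(battery_wh: float, avg_draw_w: float) -> float:
    """Estimated hours of use: watt-hours of capacity / average draw in watts."""
    return battery_wh / avg_draw_w

# A hypothetical 46 Wh battery: a frugal ~2.2 W average platform draw
# yields roughly 21 hours, while a ~6 W draw (more typical of an x86
# machine under light load) yields under 8 hours from the same cell.
print(round(runtime_hours(46, 2.2), 1))  # 20.9
print(round(runtime_hours(46, 6.0), 1))  # 7.7
```

The same arithmetic explains why shrinking the mainboard matters: freed-up volume can go straight into a larger battery, compounding the efficiency gain.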

Now, battery test methodologies are the dirty secret of the tech industry, so we’re still a little skeptical about exactly how well Snapdragon-based computers will hold up. Real-world testing will have to wait until the first models are on sale, though ASUS has said its NovaGo can play video for up to 22 hours solid. Unsurprisingly, if you’re doing something more system-intensive than just looping footage, the battery will drain more quickly, but it’s still shaping up to be significantly better than a traditional notebook could manage.

Expect Windows 10 S out of the box

Windows on Snapdragon wouldn’t work if developers had to make special apps, or if you were only limited to mobile software. Happily, that’s not the case: the HP ENVY x2 and ASUS NovaGo will run Windows 10 S. It’s the same version of the OS as on the Surface Laptop.

Microsoft is targeting it at students and similar users, and there are a few limitations compared to a regular copy of Windows 10 Pro. For a start, you can only download and install software from the Windows Store – that’ll include things like a “specially optimized” version of Office 365 for Snapdragon-powered machines, Microsoft says. As on the Surface Laptop, however, if you really want full Windows 10 Pro, you can get it.

The exact process hasn’t been confirmed yet, but our expectation is that you’ll pay a one-time fee to unlock the full OS. You likely won’t be able to backtrack to Windows 10 S, though that’s not our primary concern. What’s uncertain at this stage is just how well the Snapdragon 835 will hold up running regular Windows 10 apps. Low power requirements are great for prolonging battery life, but not when you need raw performance: if you were hoping for a 4K video editing system that lasted 20+ hours away from the charger, you’re out of luck.

Cellular will be the hard sell

One of the big advantages of being based on a smartphone chipset is that cellular connectivity is designed in from the outset. The Snapdragon 835 in the Windows on ARM machines will be accompanied by Qualcomm’s X16 Gigabit LTE modem, with support for up to 1 Gbps downloads, network depending.

While you’ll obviously have WiFi too, our expectation is that you’ll struggle to buy a NovaGo or ENVY x2 without activating a data connection at the same time. Sadly, unlike with a phone, there’s no carrier subsidy to lower the cost; networks will, however, almost certainly incentivize retailers to push data activations on new sales.

How much that will cost you every month is similarly unclear at this point. In an ideal world, you’d be able to add a Windows on Snapdragon ultraportable to your existing cellular plan for a small monthly fee, and use whatever data allowance you’re already relying on for your smartphone, tablet, hotspot, car, or smartwatch. Carriers, however, might see the potential for heavy data use on a notebook form-factor device being considerable, and thus prefer to push more expensive plans.

ASUS and HP are first, but they won’t be alone

So far, we’ve seen two Windows on Snapdragon machines. The ASUS NovaGo is expected to reach the market first, a 2-in-1 convertible notebook with 360-degree hinges and a touchscreen. It’ll be priced from $599 for a basic-spec model, or $799 for one with considerably more storage and more memory.

Meanwhile, the HP ENVY x2 will borrow the detachable form-factor of the PC maker’s Spectre series. It’ll work as a tablet, complete with a digital stylus for Windows Ink, or with a magnetically-attached, backlit keyboard. Pricing hasn’t been confirmed at this stage, with HP only saying it’ll announce that closer to the ENVY x2’s Spring 2018 release.

There’s another big name with hardware waiting in the wings, mind. Lenovo is confirmed as a Windows on Snapdragon partner, but won’t be bringing out its machine to play until CES 2018 in January. It’s unclear what form-factor it’ll adopt, though given it has a large Yoga line-up of 2-in-1s with rotating hinges, that seems like a safe bet.

Wrap-up

The potential of a full OS on a low-power mobile processor is something we’ve been hearing about for years. It’s arguably only with the very latest chipsets that we’ve reached the point where that sort of combination is practical. Nonetheless, there are still some big questions lingering that only real-world experience will settle one way or another.

Can Windows on Snapdragon notebooks and 2-in-1s really deliver all that battery life without compromising on everyday performance? And will their competitive price tags be unduly offset by egregious data fees? Perhaps more fundamental, is ARM at the core the end-goal for Windows 10 devices without discrete graphics chips? That may still be some way out, but it could have a huge impact on not only Intel and AMD’s businesses, but that of other companies relying on their low-power chips, including Apple.

(slashgear.com, https://goo.gl/rAXJn4)

HP Omen X VR Backpack Review

You’ve never seen a PC like HP’s Omen X Compact Desktop. It’s a powerful gaming rig, and it’s small enough to do double duty as a pseudo console under your TV. But it also has a built-in battery. And when attached to HP’s revamped VR backpack accessory and its mixed reality headset, you can experience high-end virtual reality in a completely new and freeing way. The only problem? The entire package will cost you close to $3,500. That puts it far out of reach for the vast majority of gamers, even those who don’t mind shelling out for the latest hardware.

Pros

  • Powerful hardware can handle 4K gaming
  • Portable enough to move around easily
  • Built-in battery is convenient
  • Comes with a dock

Cons

  • Expensive compared to full-sized desktops
  • No SD card reader

Summary

HP’s Omen X Compact Desktop is expensive, but it’s an ingenious gaming device. It can handle all of your PC gaming needs, but you can also throw it in a backpack for LAN parties, or place it right alongside your game consoles.

Hardware

True to its name, the Omen X Compact Desktop is sleek and relatively portable. On its own, it weighs 5.5 pounds — lighter than popular gaming notebooks from Alienware and Razer. The desktop also shares the same aesthetic as HP’s Omen 15 and Omen X laptops. It’s got a sharp, angular design with a stylish black case and red accents. It makes a statement on its own, but it really shines when you place it in the bundled dock, which turns it into a monument to PC gaming.

The dock gives you an easy way to connect the desktop to your workspace, and to quickly remove it, without having to deal with plugging in cables. The front features two USB 3.0 ports as well as USB-C, while the back has another three USB connections, an Ethernet jack, and DisplayPort and HDMI. There’s also a power connection for recharging the batteries from HP’s VR backpack.

The compact desktop, meanwhile, has two USB ports up top, USB-C, HDMI, Mini DisplayPort, a headphone jack and, conveniently enough, a power connection for the HTC Vive headset. On top of that, there are two more USB ports along the lower side of the desktop. There’s no SD card slot on the computer or its dock, which seems like a surprising omission for such a fully featured setup.

As for HP’s Mixed Reality headset, it’s in line with what we’ve seen from other Windows VR gear. It’s relatively light and, most importantly, easy to put on and take off. There’s a liberal amount of padding around the face area and headband, which ensures a comfortable fit. There also aren’t too many straps to mess with: you just loosen the headband with a rear dial and tighten it once you’ve put it on. Thankfully, you can flip the visor portion up, allowing you to see the real world without removing the entire device. That’s one of the more useful features we’ve seen on Mixed Reality devices.

Each of the headset’s lenses features a 1,440 by 1,440 resolution at 90Hz — the same specs we’ve seen on most Windows Mixed Reality headsets. There’s a headphone jack along the bottom (you’ll have to supply your own headphones), as well as a short built-in cable. The latter is particularly helpful, since it lets you use a long cable to connect to the desktop normally, but you can also swap it out for a shorter cable to use with the backpack.

As you’d expect, HP also includes two mixed reality controllers. Each features a large sensor ring for spatial tracking, as well as a thumbstick, and a trackpad that can also be recognized as four separate buttons. There are also the usual things we see on every VR controller these days: trigger and grab buttons, along with menu and home options.

Tying all of these new gaming devices together is HP’s VR backpack, which is meant to let you experience virtual reality without being tied down to a large desktop. It sports padded straps and a plastic panel, which the Compact Desktop slides onto securely. It also has two side holsters for battery packs. HP includes four batteries with the backpack, so you can keep one pair charged while you’re using the other.

Performance and battery life

Device | PCMark 7 | PCMark 8 (Creative Accelerated) | 3DMark 11 | 3DMark (Sky Diver) | ATTO (top reads/writes)
HP Omen Compact Desktop (2.9GHz-3.9GHz i7-7820HK, NVIDIA GTX 1080 [overclocked]) | 7,040 | N/A | E21,786 / P19,286 / X9,144 | 34,094 | 3.1 GB/s / 1.65 GB/s
HP Omen 15 (2.8GHz Intel Core i7-7700HQ, NVIDIA GTX 1060) | 6,727 | 6,436 | E14,585 / P11,530 / X4,417 | 20,659 | 1.7 GB/s / 704 MB/s
ASUS ROG Zephyrus (2.8GHz Intel Core i7-7700HQ, NVIDIA GTX 1080) | 6,030 | 7,137 | E20,000 / P17,017 / X7,793 | 31,624 | 3.4 GB/s / 1.64 GB/s
Alienware 15 (2.8GHz Intel Core i7-7700HQ, NVIDIA GTX 1070) | 6,847 | 7,100 | E17,041 / P16,365 | 20,812 | 2.9 GB/s / 0.9 GB/s
Alienware 13 (2.8GHz Intel Core i7-7700HQ, NVIDIA GTX 1060) | 4,692 | 4,583 | E16,703 / P12,776 | 24,460 | 1.78 GB/s / 1.04 GB/s
Razer Blade Pro 2016 (2.6GHz Intel Core i7-6700HQ, NVIDIA GTX 1080) | 6,884 | 6,995 | E18,231 / P16,346 | 27,034 | 2.75 GB/s / 1.1 GB/s
ASUS ROG Strix GL502VS (2.6GHz Intel Core i7-6700HQ, NVIDIA GTX 1070) | 5,132 | 6,757 | E15,335 / P13,985 | 25,976 | 2.14 GB/s / 1.2 GB/s
HP Spectre x360 (2016, 2.7GHz Core i7-7500U, Intel HD 620) | 5,515 | 4,354 | E2,656 / P1,720 / X444 | 3,743 | 1.76 GB/s / 579 MB/s
Lenovo Yoga 910 (2.7GHz Core i7-7500U, 8GB, Intel HD 620) | 5,822 | 4,108 | E2,927 / P1,651 / X438 | 3,869 | 1.59 GB/s / 313 MB/s
Razer Blade (Fall 2016) (2.7GHz Intel Core i7-7500U, Intel HD 620) | 5,462 | 3,889 | E3,022 / P1,768 | 4,008 | 1.05 GB/s / 281 MB/s
Razer Blade (Fall 2016) + Razer Core (2.7GHz Intel Core i7-7500U, NVIDIA GTX 1080) | 5,415 | 4,335 | E11,513 / P11,490 | 16,763 | 1.05 GB/s / 281 MB/s
ASUS ZenBook 3 (2.7GHz Intel Core i7-7500U, Intel HD 620) | 5,448 | 3,911 | E2,791 / P1,560 | 3,013 | 1.67 GB/s / 1.44 GB/s
Razer Blade Stealth (2.5GHz Intel Core i7-6500U, Intel HD 520) | 5,131 | 3,445 | E2,788 / P1,599 / X426 | 3,442 | 1.5 GB/s / 307 MB/s

The Compact Desktop is actually made out of laptop hardware. It’s powered by an Intel Core i7-7820HK CPU, which is unlocked to make it easier to overclock. It also features NVIDIA’s GTX 1080 notebook GPU (overclocked out of the box) with 8GB of video RAM. Additionally, the Compact Desktop packs in 16GB of DDR4 memory and a 1TB SSD. Clearly, HP didn’t skimp on components — this thing can easily take on full-sized gaming rigs.

In Gears of War 4 running at 4K with High settings, it managed an impressive average framerate of 56 FPS. Stepping down to 1440p, it reached 90 FPS with Ultra settings, and 121 FPS at 1080p. Basically, it’ll handle every modern game without trouble. Given its diminutive size, it could also serve as a home theater PC that can blow away the latest 4K-ready consoles, like the Xbox One X and PlayStation 4. (Of course, that should be expected when it costs five times as much.)

The desktop also has a built-in battery, which lasted 1 hour and 10 minutes while running the PC Mark 8 benchmark. Clearly, it’s not something you’re meant to use unplugged for very long. It’s still convenient though, since it means you can connect the Compact Desktop to the VR backpack accessory, and swap out extra external batteries, without shutting it down. That’s a simple thing you can’t do with other VR backpack systems.

Using virtual reality

In desktop mode, HP’s mixed reality headset was a cinch to set up. All you have to do is plug in an HDMI and USB cable. There aren’t any sensors to install, like with the Oculus Rift and Vive. Everything on Microsoft’s VR platform relies on built-in sensors, instead.

When you put on the headset, you’re thrown into the Windows Mixed Reality Portal, which is where all of the VR magic happens. You’ll be asked how you want to use your headset: in walking mode, which replicates the room-scale VR we’ve seen on the HTC Vive, or sitting and standing in place. If you choose the latter, you can immediately start moving around Microsoft’s virtual living room and testing out mixed reality apps. If you want to walk around VR environments, though, you’ll need to clear out nearby furniture and trace a virtual boundary using the headset first. It’s all meant to keep you from bumping into your desk and nearby walls.

Compared to the virtual living rooms from Oculus and HTC, which serve as a home base for everything you’re doing in VR, Microsoft’s feels comfortable. And even though there are only 60 virtual reality apps in the Windows Store so far, including notable entries like Superhot and Arizona Sunshine, there’s still plenty to do. Superhot feels just as smooth and immersive as it did on the HTC Vive, and since the headset is higher-res, everything looked sharper as well. Watching trailers in the virtual screening room almost felt like being in a theater, and the headset handled 360-degree videos well. The Star Wars Rogue One behind-the-scenes experience felt just as immersive as it did on other headsets.

Microsoft also wisely partnered with Valve to bring SteamVR over to Mixed Reality headsets. Steam automatically recognized HP’s model when I started it up, and I was able to hop into Rez Infinite. After playing for a while, though, it’s clear that Microsoft’s VR controllers aren’t nearly as ergonomic as Oculus’s excellent Touch controllers. Hitting the trigger and grip buttons doesn’t feel very natural, and since the controllers have straight handles, they don’t fit easily into the natural curve of your hands. Hopefully, that’s something Microsoft can fix with its next controllers.

When it comes to HP’s backpack accessory, I was honestly surprised how much I enjoyed using it. It made diving into VR more immersive, since I didn’t have to worry about getting tangled in any cables tied to a large desktop. Sure, the setup is a bit more involved: You’ve got to attach the compact desktop, slide in the battery packs, and make sure everything is connected properly. The backpack feels surprisingly comfortable to wear, thanks to its padded shoulder straps and two front straps. The entire setup clocks in at 8.3 pounds, which isn’t much more than what I typically lug around every day in my backpack.

While playing Superhot, I was able to dodge bullets and take out bad guys far more easily, since I was free to move and bend in ways I couldn’t with a typical VR setup. Of course, there’s also the danger of hitting a wall and running into furniture. Even if you set up virtual borders, it’s easy to miss those when you’re swept up in the game world. And, oddly enough, you can’t quickly set up new borders in backpack mode — you can only do it in desktop mode with a monitor attached. So while you can conceivably take the entire VR backpack setup anywhere — as I did around our offices — you’re stuck using it without anything to warn you about walls or obstacles.

HP claims each pair of batteries adds one hour of juice to the backpack setup, on top of what you get from its built-in power source. In my testing, 15 minutes of gaming typically used up around 10 percent of battery life. (I wasn’t able to stay in VR long enough to drain the batteries completely.) Of course, that timing will depend on what, exactly, you’re doing. Sitting back and watching a video, or just browsing the web, could stretch the battery life longer.
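Extrapolating from that quoted drain rate is straightforward, though real-world drain is rarely linear, so treat the result as a ballpark rather than a measured figure:

```python
# Linear extrapolation of the observed drain rate: ~10% of charge used
# in 15 minutes of gaming. A rough estimate only; actual drain depends
# heavily on the workload, as the review notes.

def estimated_runtime_minutes(minutes_observed: float, percent_used: float) -> float:
    """Total minutes per full charge if drain continued at the observed rate."""
    return minutes_observed * 100.0 / percent_used

minutes = estimated_runtime_minutes(15, 10)
print(minutes, minutes / 60)  # 150 minutes, i.e. 2.5 hours of gaming per charge
```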

But I can’t help being a VR backpack skeptic. Wireless VR solutions are already here, and they’re only going to get better over the next year. VR backpacks are already an incredibly niche category, and it won’t be too long until they’re completely unnecessary.

If you want to use HP’s headset with the backpack, you’ll also have to pick up a $10 virtual display dongle. That’s due to an issue with Windows, which simply doesn’t spit out an image to mixed reality headsets unless it detects a connected display. While it would make more sense for HP just to include one of those adapters in the box, the company says it’s hoping Microsoft comes up with a software fix instead. Oddly, the backpack setup will work fine with an HTC Vive without that workaround.

Pricing and the competition

You’d really need a lot of extra spending money to dive into HP’s VR ecosystem. The compact desktop costs $2,499, while its mixed reality headset is an additional $449. And don’t forget the backpack, which adds another $499. It’s so costly that it’s out of reach even for many early adopters. It’s more suited to developers who want to explore what’s possible with portable VR.

There are, of course, other VR backpack options on the market, like those from Zotac and MSI. They’re all bigger and clunkier than HP’s system, but at least they’re not as expensive. Zotac’s VR Go starts at $1,800, and it includes both the desktop and backpack accessory. You’ll still need to add your own headset, though.

Wrap-up

Ultimately, the Omen X Compact Desktop’s power and unique capabilities help it stand out from the gaming crowd. HP’s mixed reality headset, meanwhile, is a solid entry into new territory, one that’s bolstered by Microsoft’s growing VR platform. And even though VR backpacks might not be around for long, and they’re certainly not something most people should consider, HP’s entry remains the best one we’ve seen so far.

(engadget.com, https://goo.gl/B9ysUA)

BenQ GL2580HM Review

Pros

  • Stylish frameless design
  • Surprisingly good image quality
  • Internal power supply

Cons

  • Can get IPS LCD monitors for the same price
  • No DisplayPort input

Key Features

  • Review Price: $150
  • 24.5-inch screen size
  • 1920 x 1080 pixel resolution
  • TN LCD panel
  • Frameless design
  • 250nits maximum brightness

What is the BenQ GL2580HM?

The GL2580HM is a fairly typical entry-level monitor. It measures 24.5 inches from corner to corner, it has a 1080p resolution, and it uses the less desirable TN type of LCD panel, as opposed to IPS. However, this is the reason it costs just £140.

Moreover, unlike many a cheap monitor, with the GL2580HM you get a sleek, “frameless” bezel. It also offers decent connectivity, and speakers are included too. If you’re in the market for a budget screen, it’s certainly worth considering.

BenQ GL2580HM – Design and build

The standout feature of the GL2580HM – considering its price – is its narrow bezel. The top and sides of the screen merge with the frame behind, leaving just a slight 1.5mm physical bezel. This significantly ups the monitor’s desk appeal over a conventionally framed display.

The choice of all-black for the rest of the display works well too. An Apple 5K display this isn’t, but this BenQ is no dumpy, cheap-looking panel either.

However, while the stand and frame may be stylish, they’re not particularly feature-packed. Most notably, the stand offers only a slight back and forth tilt adjustment, with no height or swivel adjustment at all. Instead, those who want a display that they can move around more easily will have to take advantage of the 100 x 100mm VESA mount on the rear.

The BenQ GL2580HM also doesn’t offer anything in the way of extras. There’s no USB hub, no headphone stand and no gaming features – but that’s as you’d expect for the price.

There are a couple of niceties, though. The power supply is internal, so you’re not lumbered with an annoying power brick.

Connectivity is also decent with one DVI, one HDMI and one VGA port. For anyone with a reasonably modern computer, an extra HDMI or a DisplayPort would have been more useful than the DVI and VGA, but they’ll be handy for backwards compatibility.

On this HM version of the GL2580 you get speakers and an audio input, enabling you to pipe music to the screen when using DVI or VGA. Meanwhile, the GL2580H version drops the speakers and audio input and is around £10 cheaper. Both models include a headphone jack.

For the gamers out there, the GL2580HM doesn’t have too much to shout about. Its TN panel means the pixel response time is just 2ms, but the monitor is only rated to refresh at up to 60Hz. However, I was able to get the monitor running at 75Hz by creating a custom resolution in AMD’s drivers, so you can eke out a few extra frames per second.
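The gain from that overclock is easy to quantify: refresh rate determines the time between frames, so pushing 60Hz to 75Hz means 25% more frames per second and a shorter wait for each new frame. A quick sketch of the arithmetic:

```python
# Frame-time arithmetic for the 60Hz -> 75Hz overclock.

def frame_time_ms(refresh_hz: float) -> float:
    """Milliseconds between screen refreshes at a given rate."""
    return 1000.0 / refresh_hz

print(round(frame_time_ms(60), 2))  # 16.67 ms per frame at stock
print(round(frame_time_ms(75), 2))  # 13.33 ms per frame overclocked
print((75 - 60) / 60)               # 0.25 -> 25% more frames per second
```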

Notably, though, you don’t get FreeSync. This is a feature that can make games look better by reducing image tearing and stutter, and it’s relatively common even on cheap displays.

BenQ GL2580HM – OSD and setup

The GL2580HM has five buttons on the underside of its frame that are used to control its onscreen display (OSD). These aren’t quite as intuitive as the joystick on the LG 23MP68VQ-P, but they’re still easy enough to use. Crucially, the buttons are nice and large, so it’s easy to feel your way around them. The layout of the menus also means it always feels reasonably natural as to which button you should press next.

You get a comprehensive set of options for setting up the display. All the basics such as brightness and contrast are present, plus you get colour temperature, saturation and gamma settings. These options are often lacking in cheaper displays. Their inclusion here means it’s easy to dial in just the right colour balance.

As for gaming, there’s only a single option available to tweak: Advanced Motion Acceleration, or AMA. This is just another term for overdrive, which is where a higher voltage is applied to each pixel to make it transition faster. It can slightly reduce some of the blurriness associated with LCD screens, although here it seemed to make very little difference.

BenQ GL2580HM – Image Quality

For a basic TN LCD monitor, this display is astonishingly good. Unlike so many such monitors, the poor viewing angles of the TN technology don’t result in an image that constantly feels like it’s shifting and changing as you move your head by even the slightest amount.

Tilt the screen up and down and you get the tell-tale lightening and darkening of the image, but in normal use it’s hardly noticeable.

What’s more, it appears to produce surprisingly rich colours; there’s plenty of contrast, and the colour balance looks spot on. There’s no red, green or blue tinge to it, and the gradient of colour from light to dark feels right.

Viewing angles: From below the display looks darker; from above it’s lighter

I fired up my colorimeter and tested the monitor’s colour performance – and, sure enough, it proved rather impressive.

BenQ GL2580HM – Non-calibrated image quality metrics

  • Max Brightness: 265 nits
  • Contrast: 712:1
  • Gamma: 1.99
  • Colour temperature: 6915K
  • Delta E average: 0.14
  • sRGB coverage: 92.2%
  • DCI-P3 coverage: 71.4%

Its colour temperature of 6915K is close enough to the ideal of 6500K that most users would never feel the need to adjust the colour balance. Its colour space coverage is also decent, managing 92.2% of the sRGB colour space. Its Delta E score of 0.14 is excellent. In other words, this monitor is quite capable of producing nearly all the standard computer sRGB colour space and it can pick out all the fine differences between those colours too.

Meanwhile, a gamma score of 1.99 is a little way off the ideal of 2.2, but switching to the next option down in the OSD’s gamma setting soon sorted that out.
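Why that small gamma gap matters: a display’s gamma shapes how input levels map to output luminance (roughly output = input^gamma on a 0–1 scale), so a lower-than-ideal gamma lifts the midtones and makes the image look slightly washed out. A minimal sketch of the effect on a mid-grey input:

```python
# Simplified display gamma model: output luminance ~= input ** gamma,
# with both normalised to the 0-1 range. This is a textbook power-law
# approximation, not a measurement of the BenQ panel itself.

def display_output(input_level: float, gamma: float) -> float:
    """Relative output luminance for a normalised input level."""
    return input_level ** gamma

mid_measured = round(display_output(0.5, 1.99), 3)  # 0.252 at the measured 1.99
mid_ideal = round(display_output(0.5, 2.2), 3)      # 0.218 at the ideal 2.2
print(mid_measured, mid_ideal)  # lower gamma -> brighter midtones
```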

Maximum brightness is a little low at just 265 nits, but this is more of an academic consideration: most of the time, we recommend using a monitor at a much lower brightness.

The only slight letdown is contrast, which is just 712:1. We generally like to see closer to 1000:1, even on basic TN displays. However, it still looks decent, and I only really consider 600:1 and lower to be truly detrimental to image quality.
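Contrast ratio is simply white luminance divided by black luminance, so the measured figures imply how dark the panel’s blacks actually get. A quick back-of-the-envelope check:

```python
# Contrast ratio = white luminance / black luminance, so the measured
# 265-nit peak and 712:1 ratio imply the panel's black level.

def black_level_nits(white_nits: float, contrast_ratio: float) -> float:
    """Implied black luminance in nits for a given peak white and contrast."""
    return white_nits / contrast_ratio

print(round(black_level_nits(265, 712), 2))  # ~0.37 nits at full brightness
```

At a 1000:1 ratio the same 265-nit white would imply blacks closer to 0.27 nits, which is why higher-contrast panels look less grey in dark scenes.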

BenQ GL2580HM – Calibrated image quality metrics

  • Contrast: 782:1
  • Gamma: 2.21
  • Colour temperature: 6463K
  • Delta E average: 0.13
  • sRGB coverage: 91.9%
  • DCI-P3 coverage: 71.4%

All told, aside from changing the gamma setting, this display is ready to roll right out of the box. However, if you do have a colorimeter and can calibrate the display then it can do even better.

Tweaking the colour balance from 100 x 100 x 100 (RGB) to 99 x 100 x 96 pulled the colour temperature to 6460K – just 40K short of the ideal 6500K figure. This also improved contrast to 779:1 and Delta E to 0.13.

Trying the 75Hz refresh rate and the Premium AMA setting didn’t revolutionise this display’s gaming performance. However, if gaming is a key concern then this display will do better than similarly priced displays that have IPS LCD panels, or those that are limited to just 60Hz.

Why buy the BenQ GL2580HM?

There’s much to like about this display. It has an attractive design plus it includes a number of practical features, such as its internal power supply, decent selection of connectivity and included speakers. It also offers surprisingly good image quality for a basic TN LCD panel, and generally it feels a step up from the very cheapest 1080p monitors.

Add to this the fact that it can run at 75Hz and it has just a 2ms response time, and it can even offer a slightly better gaming experience than some.

However, it’s also a touch close to the price of other monitors that include more desirable IPS LCD panels, such as the LG 23MP68VQ-P. They also tend to be slightly smaller 23-inch screens, but since they offer the same resolution, they’re much of a muchness.

All told, the BenQ GL2580HM isn’t a display that overly excites – but it isn’t one that will do you wrong either.

Verdict

For a budget 1080p monitor, the BenQ GL2580HM has plenty going for it – even if it doesn’t truly excel at any one thing.

(trustedreviews.com, https://goo.gl/hGMJh4)

 

Alienware AW3418DW review

OUR VERDICT

The Alienware AW3418DW is the fastest gaming monitor with a 21:9 ultrawide aspect ratio on the market today, with an overclocked refresh rate of 120Hz. If you want smooth, high-frame-rate games with an immersive vision-filling aspect ratio, and plenty of game-specific features, the Alienware AW3418DW is an excellent – if pricey – choice.

FOR

  • Fastest ultrawide monitor
  • Large, immersive screen
  • Nice design
  • Excellent features

AGAINST

  • Very expensive
  • Not a huge amount of ports

The Alienware AW3418DW gaming monitor is another ultrawide display that offers immersive gameplay experiences by filling the player’s peripheral vision with an elongated, 21:9 aspect ratio.

While most modern monitors and widescreen TVs have an aspect ratio of 16:9, a 21:9 aspect ratio means the screen is a lot wider on the horizontal plane. This extra visual real estate doesn’t just benefit games, as a wider screen can help productivity as well, allowing you to have far more open windows than usual.

However, with the Alienware brand, it’s no surprise that the AW3418DW is a gaming monitor first and foremost, and it goes up against the likes of the AOC Agon AG352UCG and  LG 34UC79G-B, two fantastic monitors which have made it onto our best gaming monitor list.

So, with some pretty stiff competition, let’s see how the Alienware AW3418DW performs.

Price and availability

Alienware is a brand that’s often associated with premium prices, and with ultrawide gaming monitors rarely coming cheap, it may come as little surprise that the Alienware AW3418DW is a pricey monitor, with a $999 price (£1,100, AU$2,099).

Compared to the AOC Agon AG352UCG’s price of $899 (£799, around AU$1,200), this is a far more expensive proposition. That could be a bit of an issue for the Alienware, as – on paper – the AOC offers a slightly larger screen size (35 inches), along with many of the same features, such as G-Sync (high refresh rates for smoother gameplay) and a 3,440 x 1,440 resolution.

So, the AW3418DW has to prove the extra money is worth it in the performance department.

One thing that the Alienware AW3418DW has that its competitors do not, and which may make the high price tag worthwhile, is its ability to be overclocked to a 120Hz refresh rate, making it faster than its rivals.

Design

One of the main things Alienware products are known for (apart from their high prices), are the sleek and eye-catching designs that embrace gamer aesthetics without being too over-the-top and garish. So, colored LEDs, metallic shades and sharp angles are all present and correct, but there’s also a, shall we say, ‘maturity’ about the design that means it’s a good-looking gaming monitor, without offending anyone’s tastes.

Due to the size of the screen, the Alienware AW3418DW comes in a very large box. Luckily, once opened, it is easily put together without the need for any tools. This makes setup much easier, and the hardware still feels nice and sturdy.  Once set up, the screen can be easily adjusted to make it more comfortable to use.

The thin bezels around the screen, especially the top and bottom, mean the monitor isn’t wasting space – but bear in mind that this is a large monitor, and it will dominate pretty much any desk you put it on. The stand, made from metallic-coated plastic and featuring the Alienware logo in various places, does take up quite a bit of space on the desk as well, more so than the stands we’ve seen on other ultrawide monitors.

The back of the monitor is silver, with another Alienware ‘alien head’ logo that lights up, and a few more LED strips. Overall, the design here is quite nice and, if you’re not a fan of the LED lighting, you can turn it off using the onscreen menu.

Back to the front, and there’s a simple, yet alluring, line of text in the middle of the bottom bezel that simply says ‘Alienware’. Along the bottom-right side of the screen are the buttons that control the onscreen menu, along with the power button (which is lit up with an LED, which you can change the color of, like the other LED lights on the monitor).

There are no icons or hints of what each button does on the bezel, which keeps things looking neat and tidy. But, that can lead to confusion when trying to use the buttons to go through the menu options. Thankfully, Alienware displays the icons for what each button does when the onscreen menu appears (which happens when any of the buttons are pushed), making things a little easier.

Connection-wise there is an audio-in jack, DisplayPort 1.2, HDMI 1.4, a USB upstream port (for connecting to your PC), and two USB 3.0 ports, allowing the monitor to be used as a USB hub. There are another two USB 3.0 ports on the bottom of the screen, along with a headphone jack. If you extend the screen to the top of the stand, these ports are much easier to access than the ones on the back, so we commend Alienware for the clever placement.

Overall, the Alienware AW3418DW is a very nicely designed gaming monitor that sticks close to the look and feel of other Alienware devices. If you have an Alienware Aurora R6 desktop gaming PC, for example, this monitor will complement it incredibly well. Even if you don’t own any other Alienware kit, the design is subtle enough to fit in with most PCs and peripherals.

Performance

One of the first things you’ll notice when you load up a game and begin playing on the Alienware AW3418DW, apart from the immersive aspect ratio that fills your vision, is just how gloriously smooth gameplay is. This is thanks to the 100Hz refresh rate and G-Sync technology, which offer high frame rates without screen tearing. The results are remarkable.

For most gamers, going back to a monitor without G-Sync will be a real struggle.

While 100Hz is a higher refresh rate than many standard monitors (which usually max out at 60Hz), it’s not quite the highest refresh rate possible for G-Sync monitors, which are capable of up to 240Hz. However, it’s pretty standard for 21:9 monitors, with the LG 34UC79G-B and AOC Agon AG352UCG both offering 100Hz refresh rates.

The Alienware AW3418DW has an ace up its sleeve, however, with an overclock setting that ups the refresh rate to 120Hz.

The jump from 100Hz to 120Hz is pretty noticeable. When overclocked, the action onscreen becomes noticeably smoother. We played Wolfenstein II, which maxes out at 100Hz, and Call of Duty: WWII at both 100Hz and 120Hz – and in both games, the performance of the Alienware AW3418DW impresses.

The jump from playing at 60Hz to 100Hz really is impressive, and it’s something that you really need to experience yourself. Everything feels much more smooth and responsive, especially with fast-paced action. Going back from 100Hz to 60Hz can be very jarring, almost like the character you’re playing as is wading through quicksand.
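Frame times make these refresh-rate comparisons concrete: each step up shaves milliseconds off the gap between frames, with diminishing returns at the top end. A quick sketch:

```python
# Frame time (ms) is the reciprocal of the refresh rate.
def frame_time_ms(hz: float) -> float:
    """Milliseconds between frames at a given refresh rate."""
    return 1000.0 / hz

for hz in (60, 100, 120):
    print(f"{hz}Hz -> {frame_time_ms(hz):.1f} ms per frame")
# The 60→100Hz jump saves about 6.7 ms per frame, while 100→120Hz
# saves only about 1.7 ms — still noticeable, but a subtler difference.
```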

Going up to 120Hz is also noticeable. If you have the hardware to cope – we played with an Nvidia Titan Xp graphics card, allowing us to whack graphical settings to the max while maintaining those high refresh and frame rates – you’ll really see the difference.

We’re not exaggerating when we say this is a game changer.

You can also enable a faster response time using the onscreen menu, choosing between ‘Normal’, ‘Fast’ and ‘Super Fast’. The idea is that the faster the response time, the less lag there is: less time between the movements you make with your mouse and keyboard and how your character moves on the monitor.

While this is a welcome option to have, we find that putting the response time to ‘Super Fast’ adds an element of ghosting on screen. This manifests itself during games with a faint grey outline during fast action.

We also notice it outside of gaming when, for example, browsing websites, where the outlines appear around pictures and text when scrolling. It’s a minor complaint, and it can be remedied by setting the response time to ‘Normal’, or even ‘Fast’. But, for us, the benefits aren’t worth it. The response time of the Alienware AW3418DW (4ms) should be low enough for most people anyway.

As we’d expect from a G-Sync panel, there are no instances of screen tearing when playing. Overall, games look amazing on the screen, and there are enough options for you to tweak the monitor to get your desired performance.

The Alienware AW3418DW also gives you the option of overlaying the frames per second (FPS) count in the top-left hand corner of the screen, which is handy if you want to see how well your hardware is performing. There are enough useful features like this for gamers that help justify the Alienware AW3418DW’s high price tag.

As far as standard image quality goes, the IPS (in-plane switching) panel of the Alienware AW3418DW does an excellent job, though some people may find it a bit bright at its default setting. Luckily, the onscreen menu is easy enough to use that tweaking the settings takes no time at all, and there’s a decent number of pre-set configurations to help as well.

The IPS panel also offers very wide viewing angles, which is pretty essential for an ultrawide monitor (especially a curved one), with color accuracy holding up in every part of the monitor.

Overall, we find the Alienware AW3418DW to be an excellent gaming monitor when it comes to performance, with superb image quality. Also, when overclocked to 120Hz, this is the fastest ultrawide monitor currently on the market. That may just be enough to justify its more expensive price tag compared to its competitors.

Final verdict

There’s a lot to like about the Alienware AW3418DW. It’s large, with the 21:9 aspect ratio really helping to immerse you in your games. This can also give you a competitive edge in that you see more of the battlefield than your competitors.

It’s also fast, with an overclocked refresh rate of 120Hz, making it a speedier ultrawide than any of its competitors. If you’re looking for the fastest 21:9 monitor on the market, there’s only one choice: the Alienware AW3418DW.

G-Sync support (there’s also a version that supports AMD’s competing FreeSync standard), is the icing on the cake, making games even more of a joy to play on this screen.

Then, there’s the design. We’re very taken with the understated, yet still easily identifiable, aesthetics of the Alienware AW3418DW. It’s clearly a gaming monitor, but it doesn’t beat you over the head with a brash design. Instead, it leaves its (excellent) screen to do the talking.

However, there are some aspects of the Alienware AW3418DW that we aren’t too fond of – but luckily, these are easily outnumbered by what we like.

The high price tag is obviously one of them. Ultrawide screens are never cheap, but the Alienware AW3418DW is a fair bit more expensive than its competitors. If you’re conscious of your budget, then the AOC Agon AG352UCG is a worthy choice.

However, if you’re willing to pay a premium for the fastest ultrawide monitor on the market, then you’ll want to shell out for the Alienware AW3418DW. You won’t be disappointed.

(techradar.com, https://goo.gl/BjWpvn)

MSI Trident V3 Arctic review: Looks like a console, runs like a high-end desktop

If only this had been around for all our ’90s LAN parties

The MSI Trident V3 is quiet, easily upgraded, and packs a GTX 1070 inside a chassis the size of the Xbox One X. Pretty incredible.

Pros
  • Approximately the size of the Xbox One X/PlayStation 4 Pro
  • Easy access to internal components, and relatively upgradeable
  • Performance similar to a full GTX 1070-equipped tower
Cons
  • Swapping out the hard drive is arduous
  • Accessing the internals voids your warranty
  • Power supply leaves some wiggle room, but it’s not much

Smaller form-factor PCs typically go one of two ways: First, you can prioritize the small part of the equation. This leaves you with something beautifully tiny, but at the cost of future upgrades—space-saving comes with the caveat of proprietary and non-replaceable parts. (See: Alienware Alpha.) Or you can prioritize future upgrades, which typically means a larger and less aesthetically pleasing machine.

The MSI Trident V3 is the rare machine that can do both—at least to some extent.

MSI Trident V3 vs. consoles

It really is tiny. Scale can be tough to judge in photographs, but at 13.6 by 9.2 by 2.8 inches, the Trident is so small it’s hard to believe there’s a full-size PC inside. It’s smaller than my launch-version Xbox One, for instance, and very nearly smaller than the new Xbox One X. (The Trident is shallower, but about an inch longer and maybe half an inch taller.) It sits comfortably in “console-sized” territory, in any case.

The Xbox One X (left) and Trident (right), as seen from above.

And it seems even smaller than it is. The Xbox One’s blunt VCR-like chassis looks every inch its size. The Trident’s canted angles are needlessly flashy perhaps, but also slimming.

Speaking of flashy: I could do without the RGB lighting. That’s the one aspect I think detracts from the design, if only because it’s distracting. Given the small size of the Trident I assume most people will place it on something, be it a media center shelf, a desk, whatever. Having an RGB-lit “Y” shape on the front panel ensures the Trident won’t simply blend into the background, instead blink-blink-blinking away at you all night long.

It won’t bother everyone, and it’s also customizable—you can hop into MSI’s settings panel and turn the lighting off, “solving” the problem. There’s just not much point to it being there at all on a machine seemingly so suited for living room use, though.

It’s not illuminated here, but that odd “Y” shape on the corner is the RGB LED zone.

I love the choice of white for the chassis, though. Most of MSI’s Trident models come in the company’s standard black-and-red color scheme—as “generic gaming machine” as you can get. The model we looked at comes in “Arctic White” though, with a red MSI badge and red labels on the front ports. It’s slick. I tend to prefer black boxes—I feel they hold up better over time—but there’s no denying that fresh out of the box the white Trident is an eye-catcher. Bonus: Less noticeable dust.

The front panel is laden with ports, which also behooves living room use. Most noteworthy is a front-facing HDMI port, which MSI intends for easy VR usage. I haven’t had much reason to use my HTC Vive with the Trident, but I appreciate the gesture. At the moment, plugging in a VR headset means crawling behind my tower PC. Front-facing I/O is certainly more convenient.

You’ll also find 3.5mm headphone and microphone ports on the front, plus one USB-C and two USB 3.1 jacks. And despite the machine’s small size, the rear also features a surprising number of ports—two more HDMI ports on the motherboard itself, five USB ports, gigabit Ethernet, power, Line-In, Line-Out/headphones, and microphone. Then there’s the graphics card, which features an additional two HDMI ports (for a total of five on the machine), two DisplayPorts, and one DVI.

Wrapping up the design, I’ll note that the Trident also comes with a stand, allowing you to run the machine vertically. This is the weakest part of the package though, with the stand apparently more concerned with aesthetics than keeping your PC intact. The stand neither snaps onto the machine nor screws into it, relying instead on the Trident’s weight and four tiny rubber pads to keep it upright. If you plan to toss the Trident onto a shelf? It might be enough. But if you have an unstable desk, pets, children, or are maybe just clumsy? I wouldn’t recommend running it vertical. Even a moderate nudge could send it teetering and (if you’re unlucky) toppling over.

MSI Trident V3 Arctic specs, price, and performance

Okay, so it’s console-sized. Now how does it stack up? And the answer: Pretty damn good. Actually, the Trident we looked at was loaded, for a machine this small.

Most Trident setups run with an Nvidia GeForce GTX 1060, which is a perfectly serviceable card. (Actually, that puts it about on par with the Xbox One X.) The $1,450 model we looked at, though, takes the next step, somehow packing an 8GB GTX 1070, plus an Intel Core i7-7700 clocked at 3.6GHz, 16GB of DDR4 RAM, and both a 256GB SSD and a 1TB hard drive. Again, all that fits in a machine smaller than the original Xbox One.

It’s incredible. Sure, you’re not going to get the same performance as a full tower with a GTX 1080 Ti inside, but in a machine this size? Wow. As I said, even the Xbox One X tops out at GTX 1060 levels of power, which makes the Trident V3 more powerful than even the most powerful console on the market today. We ran the Trident through our usual battery of benchmarks, with impressive results—impressive if only because something this small put up scores similar to full-sized towers.

For instance, in Rise of the Tomb Raider at 1080p with settings on Very High, the Trident V3 averaged 107.9 frames per second. That’s right in line with other GTX 1070-equipped machines, like a Gigabyte PC we’ve looked at, which averaged 107.8 frames per second. The same goes for Shadow of Mordor with the 4K texture pack installed—130.2 frames per second for the Trident V3, 129.2 for the Gigabyte machine.

The Trident V3 even holds its own in lengthier benchmarks. That’s impressive. In-game benchmarks are usually only a few minutes long at best, so you don’t really see thermal throttling because the hardware doesn’t heat up enough for it to matter. But in one of 3DMark’s lengthier tests or our CPU-centric Handbrake encode you sometimes see heat dispersion problems you might’ve otherwise overlooked.

Not here. In 3DMark’s FireStrike Extreme test the Trident put up a score of 7828, which compares favorably to the Gigabyte machine’s 8313. And in our Handbrake test, where we transcode a 30GB MKV file down to the Android Tablet preset, the Trident V3 did it in about 38 minutes and 46 seconds—only 16 seconds longer than the Gigabyte. All evidence points to there being no significant thermal issues. The machine gets hot for sure, but as far as I can tell it’s not significantly affecting performance.
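As a rough sketch of how small those gaps really are, here is the arithmetic for expressing them as percentages (all figures taken from the paragraph above):

```python
# Express a benchmark gap as a percentage deficit versus a reference score.
def pct_slower(score: float, reference: float) -> float:
    """How much lower `score` is than `reference`, in percent."""
    return (reference - score) / reference * 100

# FireStrike Extreme: Trident 7828 vs. Gigabyte 8313.
print(f"FireStrike gap: {pct_slower(7828, 8313):.1f}%")  # ≈ 5.8%

# Handbrake: 38:46 for the Trident, 16 seconds faster for the Gigabyte.
trident_s = 38 * 60 + 46
gigabyte_s = trident_s - 16
print(f"Handbrake gap: {(trident_s - gigabyte_s) / gigabyte_s * 100:.2f}%")  # under 1%
```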

“Okay, so then noise is an issue right? If it’s moving that much heat, obviously the fans must be distracting.” Nope! Surprisingly it’s both relatively cool and quiet. Not whisper-silent, to be sure, but at even moderate volumes my sound system drowned out the Trident’s fans—and that was with it on my desk. If you placed it across the room, you’d probably never hear it. It’s certainly quieter than 2013’s Xbox One model, and about on par with the new Xbox One X.

MSI Trident V3 upgradeability

But the most important aspect of the Trident V3: It’s not only small, but upgradeable. Properly upgradeable, mostly thanks to the aforementioned GTX 1070. See, most Trident models ship with a GTX 1060—a low-power card. As such, they get away with running on a 230 watt power brick.

That’s fine, as long as you only ever plan to replace the 1060 with an equivalent card, but even upgrading to a 1070 would likely cause problems under load with that diminutive a power supply.

Since our model ships with a 1070 though, it also packs a 330 watt supply. That’s not much more than the baseline Trident, and it still comes in ugly power brick form so you’ll have to find some extra room to stash it, but you might have a bit of headroom for overclocking if you’re careful. The GPU is also easily accessed, and can be swapped out for any other small form-factor card as long as you keep the power restrictions in mind. Three or four years down the line you should be able to plop in a GTX 1470 or whatever and be good to go.

The GPU isn’t the only part that’s upgradeable though. Two screws get you into the case, which is enough to replace the RAM immediately. The CPU would take more doing, but you can dismantle the cooling system and swap it out if desired. That would only be in the event of complete failure though—the motherboard is proprietary, and given Intel’s fondness for switching CPU sockets lately you’re probably not going to be able to drop in a new-generation processor later.

RAM and CPU on the left, GPU on the right.

I only have a couple complaints. Accessing the hard drives is a royal pain, requiring you to flip the machine over and remove the entire bottom panel. It’d be easier to just use an external drive I guess, but it’s annoying given the Trident’s 1TB drive. First thing I’d want to do is upgrade that, and it’s harder than it needs to be. More annoyingly, you void your warranty by tinkering. That’s not too uncommon with prebuilts, but there is indeed a sticker over one of the screws as you head inside.

If you want to really tinker? Sure, get a full tower or build your own. Still, if you want a small machine that remains a decent investment three or four years down the line? The Trident V3 gives you enough runway to upgrade the most important components a few times, and replace the most likely points of failure too. That’s better than a lot of machines its size (not to mention gaming consoles).

Bottom line

The MSI Trident V3 Arctic offers the power of an upper-mid tier PC in a chassis the size of the new Xbox One X—and with the ability to upgrade it even further in a few years with a hypothetical GTX 1470 or whatever. It’s attractive, it’s small, it’s (mostly) discreet, and it’s also surprisingly inexpensive. The 1070-equipped model we looked at retails for a mere $1,450 on Amazon. Doing some rough back-of-napkin math, I estimate you’d only save maybe $200 or $300 on a bare bones Mini-ITX build of your own, and while you’d gain some additional room for upgrades later it definitely wouldn’t turn out this sleek, nor manage heat this efficiently.

Whether you’re looking for a living room machine, a dorm-room PC, or something convenient to take to those LAN parties you and your friends are still (in 2017!) having, the Trident V3 is definitely worth a look—specifically this GTX 1070 model. The 1060 units? Eh, I’d probably give those a pass. But this is one hell of a deal.

(pcworld.com, https://goo.gl/xRDn3V)

The new iMac Pro (2017) arrives this week — here’s everything you need to know

iMac Pro (2017): Everything you need to know about Apple’s 2017 iMac Pro

The iMac Pro is Apple’s most powerful desktop PC ever and, six months after its announcement, it’s nearly available to buy for the most demanding Mac users on earth. Here’s everything you need to know about the £4,999 iMac Pro, including the latest specs, release date and price details.

We’ve been waiting a while for Apple’s latest iMac Pro, but it’s finally here. Read on for all you need to know about the new 2017 iMac Pro.

iMac Pro Release date and Price: When does the new iMac Pro come out?

The December 14 release date lives up to Apple’s promise to release the product before the end of 2017.

That’s significant, given the firm has whiffed on a couple of these pledges in the last two years. AirPods were delayed by a few weeks in late 2016, and now HomePod is being held back until early 2018. Apple cited the need for a little extra time to perfect both.

Apple has announced its long-awaited, super-powered iMac Pro will go on sale on December 14, meeting its target to get it to power-hungry Mac lovers before the end of 2017.

The all-in-one desktop Mac was announced way back at WWDC in June and is designed for creative professionals, such as those working in high-end video/photo editing and graphic design.

The remarkably slim iMac Pro, which runs the latest macOS High Sierra software, is also the first Mac to offer support for virtual reality headsets. It’s compatible with the HTC Vive at launch.

iMac Pro Specs: Latest news and details

The iMac Pro starts at $4,999/£4,999, which will get you an 8-core Xeon processor, 32GB of RAM and a 1TB SSD. You’ll also get a Radeon Vega GPU with 8GB of RAM.

The top-end models, which are yet to be priced, will arrive in the new year. Those will feature an 18-core Xeon processor and an incredible 128GB of RAM. The hard drive is 4TB and the Radeon Vega GPU comes with 16GB of RAM.

We’ve also got word the iMac Pro has a new T2 co-processor, building on the T1 chip used to manage the Touch Bar and Touch ID within the newer MacBook Pro models.

Seeing as the iMac Pro doesn’t have the Touch Bar or Touch ID features, the new-generation T2 silicon will perform different tasks. Cabel Sasser, a Mac app developer, has the skinny.

iMac Pro Specs: What do they actually mean?

At its most basic, the iMac Pro is a high-end workstation. This means the lowest-cost model gets one of Intel’s top-end Xeon CPUs with a minimum of eight CPU cores, all the way up to 18 cores. That’s a monstrous specification, but keep in mind that Xeon processors are more sedate than the rip-roaring 18-core Core i9 chips announced at Computex 2017. Still, for multi-threaded tasks such as rendering and audio production, this thing is an absolute whizzkid.

The graphics department is a real treat as well. In fact, the GPU tech is so cutting-edge that we don’t even know what the exact specification will be. That’s because it’s using AMD’s latest Vega graphics architecture, which isn’t even out yet. It’s toting 11 teraflops (trillion floating point operations per second) of GPU power, which is far and away the most powerful graphics chip ever seen in an all-in-one PC. You’ll get up to 16GB of high-bandwidth memory with your graphics chip, and up to 128GB of ultra-reliable error correcting code (ECC) memory.
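As a hedged aside on where a teraflops figure like that usually comes from: FP32 throughput is conventionally estimated as shader count × boost clock × 2 (a fused multiply-add counts as two operations). The shader count and clock below are hypothetical stand-ins, since the exact Vega configuration in the iMac Pro wasn’t known at the time:

```python
# Conventional estimate of a GPU's single-precision throughput:
# FP32 TFLOPS ≈ shader cores × clock (GHz) × 2 ops per FMA / 1000.
def fp32_tflops(shaders: int, clock_ghz: float) -> float:
    return shaders * clock_ghz * 2 / 1000

# Hypothetical Vega-class configuration (not confirmed for the iMac Pro).
print(f"{fp32_tflops(4096, 1.35):.1f} TFLOPS")  # → "11.1 TFLOPS"
```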

All this has been made possible by a new thermal design, which Apple says increases cooling capacity by 80% over the standard iMacs. The SSDs have maximum read speed of 3GB/s. That’s even faster than the storage used on the latest MacBook Pros.

On the back, Apple has added four Thunderbolt 3 connectors for ultra-high-end peripherals, and on the front is a 27-inch 5K display that ticks all the boxes when it comes to colour coverage, including the moviemaker-friendly DCI P3 colour gamut. There’s also 10Gbps Ethernet, a rare thing indeed on computers, and 10 times faster than the standard gigabit Ethernet found on most other PCs.

iMac Pro Design: How will the 2017 iMac Pro look?

With all of that power, it’s hardly surprising the 27-inch iMac Pro has a little more girth on the back end, compared to the standard iMacs that have come to resemble flatscreen TVs in recent times.

The iMac Pro also arrives with a new, exclusive colour: Space Gray, matching the options for some MacBooks, iPads and iPhone devices. Accessories like the Magic Keyboard, Mouse and Trackpad come in the same shade.

iMac Pro: What’s in the box?

Interestingly, according to noted YouTuber Marques Brownlee, Apple is also furnishing iMac Pro buyers with a black Lightning cable, which is currently unavailable to buy separately.

This matches the ‘space gray’ accessories like the Magic Keyboard, Magic Mouse and Magic Trackpad.

9to5Mac points out the Lightning cable is still USB-A at the other end, rather than the USB-C tech that’s also integrated into the MacBook Pro.

(trustedreviews.com, https://goo.gl/M6AZEZ)


iMac: Everything you need to know about Apple’s all-in-one computer

Apple’s iMacs deliver faster processors, better screens, more powerful graphics, and, at long last, Thunderbolt 3.

The iMac is a direct descendant of the very first Mac, and it’s often the computer that longtime users think about when they think about the Macintosh. The iMac’s all-in-one design is popular and iconic.

The iMac is great for both novices and demanding users. It can handle general-purpose and heavy-duty tasks equally well. It’s ideal for someone who needs to buy a complete computer setup (keyboard, mouse or trackpad, and display) and wants to maximize workspace efficiency.

The current iMac lineup was released in June 2017 during Apple’s Worldwide Developers Conference. While the new iMacs might look the same on the outside as their predecessors, they’re quite a bit different inside. It starts with the screen: Apple says the new iMacs have the “best Mac displays ever,” upping the brightness to 500 nits to make them 43 percent brighter while bringing support for a billion colors. You’ll also get faster Intel Kaby Lake processors, higher memory capacity, super-fast storage, and next-generation graphics, as well as a pair of Thunderbolt 3/USB-C ports.

If you’re in the market for an iMac, this guide will help you make the right choice. Apple has two versions of the iMac: the standard iMac and the iMac with Retina display. Apple announced the iMac Pro at WWDC this year, but it won’t be available until December—we’ll update this guide with more info on the iMac Pro once it gets released.

Standard iMac

Apple offers one standard iMac model. It’s priced at $1,099.

Processor, memory, graphics, and storage: The $1,099 iMac has a 2.3GHz dual‑core Intel Core i5 processor, 8GB of memory, and Intel Iris Plus Graphics 640 integrated graphics. The 5,400-rpm drive has a capacity of 1TB.

You can’t upgrade the $1,099 iMac yourself after you buy it, so consider paying an extra $200 at the outset for a memory upgrade to 16GB. If you want to add more RAM later, you need to bring the iMac in to an Apple store. The $1,099 iMac also offers a Fusion Drive or a flash storage upgrade.

Display: The $1,099 iMac has a 21.5-inch display with a resolution of 1920×1080 pixels and can display millions of colors. By comparison, the 21.5-inch iMac with Retina display comes with a 4096×2304 screen that offers more image detail and can display billions of colors.
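A quick back-of-the-envelope comparison makes the resolution gap concrete — the Retina 4K panel packs roughly four and a half times as many pixels into the same 21.5-inch diagonal:

```python
# Pixel counts for the two 21.5-inch iMac panels mentioned above.
std = 1920 * 1080      # standard iMac: about 2.07 million pixels
retina = 4096 * 2304   # Retina 4K iMac: about 9.44 million pixels

print(f"{retina / std:.2f}x the pixels")  # → "4.55x the pixels"
```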

The $1,099 iMac has an sRGB display—sRGB being the standard color space, which is enough for most users. The Retina display uses the wider P3 color space, which is often used in digital movie projection and the film industry.

Input device: The iMac comes with Apple’s Magic Keyboard and Magic Mouse 2. If you order online from the Apple Store, you can switch the keyboard to a version with a numeric keypad, and switch the mouse to a wired Apple Mouse or a Magic Trackpad 2 ($50). You can opt to get both a Magic Mouse 2 and a Magic Trackpad 2 for $129 extra.

Connectivity: Wi-Fi and Bluetooth are built-in. The iMac has four USB 3.0 ports, two Thunderbolt 3 ports, a gigabit ethernet port, and an SDXC card slot. USB 2.0 devices can connect to the iMac’s USB 3.0 ports.

Speed: The $1,099 model is Apple’s slowest iMac. That said, it has enough power for productivity tasks, video and photo editing, and games. The hard drive is a performance bottleneck; if you can upgrade to a Fusion Drive or SSD, you’ll see a significant speed improvement.

Macworld’s buying advice: For new Mac owners, the $1,099 iMac is a good alternative to the Mac mini, providing a nice performance increase. If performance is your top priority, consider a Fusion Drive upgrade. On a 21.5-inch iMac, the 8GB of RAM should be fine, but buying the RAM upgrade at the point of purchase could help you avoid some hassle in the future.

iMac with Retina display

Apple offers two 21.5-inch and three 27-inch models of the iMac with Retina display. Here are the specifications and prices for the five Retina iMac models.

21.5-inch iMac with Retina 4K display
  • $1,299: 3.0GHz quad‑core Intel Core i5 processor, 8GB of RAM, 1TB 5,400-rpm hard drive, and 2GB Radeon Pro 555 graphics
  • $1,499: 3.4GHz quad-core Intel Core i5 processor, 8GB of RAM, a 1TB Fusion Drive, and 4GB Radeon Pro 560 graphics

iMac with Retina Display

With older 21.5-inch iMacs, there was no way to install a RAM upgrade after you bought the machine, so it was a good idea to add more RAM at the point of purchase. That’s no longer the case with the new 21.5-inch iMac: you can add more RAM later, but the upgrade has to be done at an Apple Store.

The 21.5-inch iMac with Retina display has a 4096×2304 resolution screen. It uses the P3 color space, which is often used for digital movie projection and in the film industry. Also, these screens offer 500 nits of brightness, an increase over the screens in older iMacs.

27-inch iMac with Retina 5K display
  • $1,799: 3.4GHz quad‑core Intel Core i5 processor, 1TB Fusion Drive, and 4GB Radeon Pro 570 graphics
  • $1,999: 3.5GHz quad‑core Intel Core i5 processor, 1TB Fusion Drive and 4GB Radeon Pro 575 graphics
  • $2,299: 3.8GHz quad‑core Intel Core i5 processor, 2TB Fusion Drive, and 8GB Radeon Pro 580 graphics

The 27-inch iMac with Retina display has a 5120×2880-resolution screen. Like the 21.5-inch models, it uses the P3 color space, which is often used for digital movie projection and in the film industry. These screens offer 500 nits of brightness, an increase over the screens in older iMacs.

Users can upgrade the RAM on the 27-inch iMac easily. The machine has four RAM slots, accessible through the back. Apple installs the standard 8GB as a pair of 4GB memory modules, so you can add more RAM after you buy the system. Or if you prefer, you can upgrade the RAM at the point of purchase to 16GB ($200) or 32GB ($600).

Input devices: The iMac comes with Apple’s Magic Keyboard and Magic Mouse 2. If you order online from the Apple Store, you can switch the keyboard to a version with a numeric keypad, and switch the mouse to a wired Apple Mouse or a Magic Trackpad 2 ($50). You can opt to get both a Magic Mouse 2 and a Magic Trackpad 2 for $129 extra.

Connectivity: Wi-Fi and Bluetooth are built-in. The iMac has four USB 3.0 ports, two Thunderbolt 3 ports, a gigabit ethernet port, and an SDXC card slot. USB 2.0 devices can connect to the iMac’s USB 3.0 ports.

Speed: The Retina iMacs are among Apple’s fastest computers when it comes to single-core performance. When it comes to multi-core speed, the Mac Pros with more than four cores are faster machines. You can improve multi-core performance by opting for the 4.2GHz Core i7 upgrade in the $1,999 and $2,299 27-inch models, or the 3.6GHz Core i7 upgrade in the 21.5-inch model—you’ll pay more, but it may be worth it to you for the performance boost.

Macworld’s buying advice: The allure of the Retina display is strong; you’ll love the way it looks. You may not love the way the price looks, however. If you’re hesitant about the price, it won’t take long to get over it once you’ve used the Retina iMac for a couple of weeks.

iMac Pro

The iMac Pro is the computer for people with the most demanding tasks. It’s targeted at creative professionals, scientists, and software developers.

Specifications: Apple hasn’t specified which processors are in the iMac Pro, but rumor has it that they will be Intel Xeon processors. Apple has stated that the processors will be available with 8, 10, or 18 cores.

At the IFA trade show in August 2017, Intel revealed the Xeon W processor, the company’s new workstation-class CPU. Could this be the processor in the new iMac Pro? It seems likely.

The Xeon W processor is available with 8, 10, or 18 cores, the same as Apple has stated will be available for the iMac Pro. The Xeon W also has support for 2,666MHz DDR4 ECC memory, which the iMac Pro uses. And Apple says the processor in the iMac Pro will have Turbo Boost speeds up to 4.5GHz, which is the same as the Xeon W.

iMac Pro

Developer Steve Troughton-Smith discovered a reference to Apple’s A10 Fusion processor in the BridgeOS 2.0 software used by the iMac Pro. Perhaps the A10 will be used for Siri voice activation, and for the iMac Pro’s boot and security process.

The iMac Pro will come standard with 32GB of 2,666MHz DDR4 ECC memory. You can configure it with 64GB or 128GB. Of note, the RAM is not user upgradeable, so make sure to add enough at purchase.

The storage device is a 1TB SSD, with options for 2TB or 4TB. The graphics card is a Radeon Pro Vega 56 graphics processor with 8GB of HBM2 memory.

Pricing will start at $4,999.

The iMac Pro comes in an aluminum space gray case. It also has a matching space gray Magic Keyboard with numeric keypad and a Magic Mouse 2. And Apple is supplying a one-of-a-kind black Lightning cable in the box as well for charging purposes.

Connectivity: Wi-Fi and Bluetooth are included for wireless connectivity. The back of the iMac Pro has four USB 3 ports, four Thunderbolt 3 ports, a 10 gigabit ethernet jack, and an SDXC card slot.

Speed: This machine will be a multi-processing beast, designed to work with pro-level apps that demand multiple processing cores. Official Apple benchmarks of the iMac Pro are not yet available, but MacRumors recently pointed out that the Geekbench results database has benchmarks for Mac models that seem to fit the specifications of the iMac Pro. These models are identified in the Geekbench database as AAPJ1371,1.

The Geekbench database shows results for an AAPJ1371,1 Mac with a 3GHz Intel Xeon W-2150B processor with 10 cores. It had a Multi-Core Score of 35,917, a huge boost over the 19,336 score of the current high-end iMac, a 27-inch model with a built-to-order 4.2GHz quad-core Core i7 processor.

The Geekbench database also has results of an AAPJ1371,1 Mac with an unidentified 2.4GHz Intel processor with 8 cores. It has two results entries that averaged out to a Multi-Core Score of 23,537.
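Taken at face value, those leaked scores imply a sizable multi-core jump over the current flagship iMac. Here is a quick back-of-the-envelope comparison, using only the Geekbench numbers quoted above:

```python
# Geekbench 4 multi-core scores quoted above (leaked, unofficial results)
imac_pro_10core = 35_917   # AAPJ1371,1 with a 10-core Xeon W
imac_pro_8core = 23_537    # AAPJ1371,1 with an 8-core CPU (average of two runs)
imac_27_core_i7 = 19_336   # high-end 27-inch iMac, 4.2GHz quad-core Core i7

print(f"10-core iMac Pro: {imac_pro_10core / imac_27_core_i7:.2f}x the quad-core i7")
print(f"8-core iMac Pro:  {imac_pro_8core / imac_27_core_i7:.2f}x the quad-core i7")
```

On those numbers, the 10-core configuration scores roughly 1.86 times the quad-core Core i7 iMac, and the 8-core configuration about 1.22 times—with the usual caveat that unverified benchmark database entries should be taken with a grain of salt.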

Macworld’s buying advice: The iMac Pro will release on Dec. 14, and if you want the fastest processing speed available, this is the Mac to get. However, the top-of-the-line configuration with an 18-core processor won’t ship until next year, and there is a new Mac Pro in the works, though you might be waiting for a while.

(macworld.com, https://goo.gl/Chrvzd)

iMac Pro (2017) first look review : Apple’s most powerful Mac is a multiprocessing beast

Made for video editing, 3D graphics, VR, and software development, Apple’s new pro Mac is an engineering feat.

The long-awaited iMac Pro is finally here. On Thursday, December 14, Apple’s new workstation-class Mac became available on the company’s website. The 8- and 10-core iMac Pro ships immediately, while the 14- and 18-core models ship in January.

The release of the iMac Pro is significant for Apple. The Mac Pro, released in 2013, has had only one minor update in four years, and Apple earlier this year admitted that the machine’s design was a mistake. And while the company could say that a top-of-the-line iMac has plenty of processing power, it’s not the workstation-level computer that demanding professionals want. This machine is a sign that Apple still values pro users and wants to offer a Mac that can meet their demands. During an iMac Pro media event (attended by Macworld), company executives and third-party developers in attendance hammered that point home.

iMac Pro pricing starts at $4,999 for the 8-core baseline configuration. The 10-, 14- and 18-core processors are offered as configure-to-order options, along with RAM (32GB, 64GB, or 128GB), flash storage (1TB, 2TB, or 4TB), and graphics.

iMac Pro: Processor and graphics

At the heart of the iMac Pro is an Intel Xeon processor. Specifically, the Xeon W, a workstation-class CPU targeted at workstation-class software that uses multiple processing cores. The processors incorporate Advanced Vector Extensions 512 (AVX 512), Intel’s instruction set for 512-bit SIMD (single-instruction, multiple-data) operations.

Apple offers four different processor configurations: 8-core, 10-core, 14-core, and 18-core. Apple considers the 10-core model the one that will appeal to most users, hitting a sweet spot between price and performance; it also offers the highest Turbo Boost frequency of the four models at 4.5GHz. The 14-core iMac Pro was not previously announced as part of the lineup.

Pushing the pixels to the iMac Pro’s display are AMD Radeon Pro Vega graphics. Base configurations come with 8GB Radeon Pro Vega 56 graphics, with an option to upgrade to the 16GB Radeon Pro Vega 64.

Apple reiterated throughout the event that these processors and GPUs were the fastest ever used in a Mac. In software demos of Adobe Dimension CC, OsiriX, TwinMotion, and Maxon Cinema 4D, high-resolution 3D images rendered on the fly in real time, with barely any noticeable jitter or lag. In VR demos with Gravity Sketch and Survios’ new Electronauts, 3D objects and animation flowed smoothly. Most impressive was a demo of Apple’s Xcode, which ran several UI tests and VMware Fusion virtual machines at the same time without the iMac Pro breaking a sweat.

iMac Pro: Display and design

Just as the iMac Pro’s processor and graphics are the best ever in a Mac, so too is the display, according to Apple.

The specifications of the display certainly are impressive. Sporting a 5120‑by‑2880 resolution and 500 nits of brightness, the 27-inch Retina display is capable of displaying billions of colors and uses the P3 color gamut. It is not, however, an HDR display. At the event, it wasn’t possible to spend any time really examining the display’s picture quality and performance, but the specs match those of the current 27-inch 5K iMac. Apple’s iMac displays have always been of top quality, and chances are this one will follow suit.

iMac Pro ports (left to right): headphone, SD card, 4 USB 3 ports, 4 Thunderbolt 3 connectors, 10Gb ethernet.

As for the external design, Apple made a conscious decision to maintain the look of the iMac that we all know; it even has similar dimensions to the iMac. One main difference is that the iMac Pro allows for user-configurable VESA mounting—the consumer iMac must be configured with a VESA mount at the time of purchase. Other differences include the obvious space gray finish (which is quite impressive in person) and rear air vents.

Speaking of the air vents, that brings us to the internal design of the iMac Pro. Though you’ll probably never see the insides, Apple made a great effort to address the cooling needs of this Mac. The iMac Pro doesn’t use a hard drive or a separate solid-state-drive mechanism; all of the flash storage is on the motherboard. This allowed Apple to install a massive heat sink and dual blowers, which Apple says results in 80 percent better cooling than the iMac’s design.

In the software demos I mentioned previously, not once did I notice any fan noise. Not a whir or the white noise of air blasting through the vents. During the Xcode demo we were encouraged to feel the back of the iMac Pro for a heat check, and it was warm to the touch, but I think I’ve felt more heat from my MacBook Pro.

A design decision that some users won’t agree with is that the iMac Pro’s RAM isn’t user-accessible. Fortunately, the RAM is installed in DIMM slots, not soldered to the motherboard, so if you have the ability to open up the iMac Pro, you can upgrade the RAM. You don’t have to order more RAM than you need at the outset, but if you want to upgrade later, Apple considers this a task that needs to be done by a service provider.

iMac Pro: New T2 chip for security

In the MacBook Pro with Touch Bar, Apple introduced the T1 chip, which handles processing and display for the Touch Bar and provides the secure enclave for Touch ID. With the iMac Pro, Apple debuts the T2, which controls components and tasks that were once covered by other discrete chips, such as the FaceTime camera, LEDs, and storage devices. The T2 essentially frees the main CPU from these menial tasks so it can focus on serious processing.

The T2 chip also provides a new set of security features. The T2 provides a secure enclave for file encryption (FileVault) and a new startup security feature, which, unfortunately, wasn’t demonstrated at the event. Apple says that the iMac Pro will include a software utility for configuring the secure boot process.

Apple doesn’t usually comment on future products, so the company won’t say if the T2 will be used in other Macs. But if you’ve been following the business side of Apple, you’ve probably heard about Apple’s dealings with third-party chip fabricators and the company’s desire to make its own silicon. It’s a foregone conclusion that we’ll see the T2 or a later generation of the chip in other Macs; it’s just a matter of when.

iMac Pro: Space gray input devices

A space gray iMac Pro wouldn’t be complete without space gray accessories, and Apple includes a Magic Mouse 2 and a Magic Keyboard with Numeric Keypad that properly match the iMac Pro. For $50 more, you can get a space gray Magic Trackpad 2 instead of a mouse, or you can pay an additional $129 to get both.

Color aside, these devices are the same as those offered with Apple’s iMacs, so if you hate the flat feel of the keyboard keys, you’ll hate the feel of the space gray keyboard. And I’m not a fan of the Magic Mouse 2, but man, the space gray mouse is gorgeous.

Apple wouldn’t say if these devices will be available for sale separately, but there’s always a possibility if the demand is great enough.

Want the space gray keyboard, mouse, and trackpad without buying an iMac Pro? No doubt you’ll see them listed on eBay by iMac Pro owners who prefer other input devices.

iMac Pro: Should you buy one?

The iMac Pro itself is a beast, both in power and price tag, capable of handling the most difficult processing tasks you can throw at it. If you aren’t sure whether an iMac Pro is the Mac for you, take a look at the reasons why you should or should not buy an iMac Pro. Macworld will do a full review of the iMac Pro once we’ve had time to really put it through its paces.

(macworld.com, https://goo.gl/jZ98yd)

Laptops in 2018: What we can expect

2018 is just around the corner, and with it, we can expect a number of new things to hit all tech gadget categories, one of them being laptops, of course.

The trend of late has been packing more and more power into thinner profiles, and we should see a lot more of that, but some laptops in 2018 will bring entirely new things to the table.

Updated Intel processors

Let’s begin with the obvious. Laptops with the latest 8th-generation Intel U-series processors have been rolling out in the last few months, and we can expect them to be standard fare in 2018. The 8th-generation Intel H-series, on the other hand, has yet to appear, but we’re betting it will launch next year.

Intel’s 8th-generation processors, codenamed Coffee Lake, are actually just a refresh of Kaby Lake (7th gen), but they’re a significant improvement. Not only are clock speeds faster, but you’re also getting two extra cores compared to Kaby Lake. Performance on multi-threaded tasks will now be faster, and you should even see battery life increase a little due to Coffee Lake’s higher efficiency.

New CPU players

It’s not just about Intel though, as team red also has their own stake in the laptop industry. We’ve seen the ASUS ROG Strix GL702ZC, which is powered by the desktop version of AMD Ryzen, but we must not forget that Ryzen Mobile chips with integrated Radeon graphics were launched just a couple of months ago. The ultrabook market, in terms of processors, is currently dominated by Intel, and Ryzen Mobile is AMD’s opportunity to take some ground.

We’ve mentioned the two main processor giants, but Qualcomm definitely deserves a spot on this list. Just last week, the company held its annual Snapdragon Technology Summit, where HP and ASUS announced the ENVY x2 and NovaGo, respectively. Both of these upcoming laptops will actually be powered by the Snapdragon 835, a platform typically found in smartphones.

Always-on, always-connected

Aside from the Snapdragon 835, the ENVY x2 and NovaGo will also feature a Gigabit Snapdragon X16 LTE modem. The low power draw and high efficiency of a chip like the Snapdragon 835 will significantly bump up battery life, and the inclusion of a Gigabit LTE modem brings nano-SIM and eSIM support for an always-connected experience.

HP ENVY x2

It doesn’t stop at HP and ASUS’ devices, as Qualcomm also announced that they are teaming up with AMD to bring thin and light notebooks powered by Ryzen Mobile and Snapdragon LTE modems to the market. Expect this trend to flourish.

Increased Thunderbolt 3 adoption

Thunderbolt 3 ports that use the USB Type-C connector already exist, especially on ultrabooks, but expect even more laptops to include them. It’s a great port to have because it’s so flexible: it can serve as a charging port, as on the MacBook Pro or Razer Blade, while delivering data and a video signal at the same time. You can even hook up an external graphics card enclosure to turn your little ultrabook into a gaming machine.

How is all this done? A 40Gbit/s transfer speed. That’s about eight times as much bandwidth as USB 3.0, and it translates to about 5 gigabytes per second of raw transfer speed.
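The arithmetic behind those figures is simple—8 bits per byte, with USB 3.0’s nominal 5Gbit/s as the baseline:

```python
BITS_PER_BYTE = 8

tb3_gbit_s = 40    # Thunderbolt 3 nominal bandwidth, Gbit/s
usb3_gbit_s = 5    # USB 3.0 (SuperSpeed) nominal bandwidth, Gbit/s

# 40 / 5 = 8x the bandwidth of USB 3.0
print(f"TB3 vs USB 3.0: {tb3_gbit_s / usb3_gbit_s:.0f}x the bandwidth")
# 40 Gbit/s / 8 bits-per-byte = 5 GB/s of raw throughput
print(f"TB3 raw throughput: {tb3_gbit_s / BITS_PER_BYTE:.0f} GB/s")
```

Note these are nominal link rates; real-world throughput is somewhat lower once protocol overhead is accounted for.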

A number of laptops already have a Thunderbolt 3 Type-C port, but it is far from being an industry standard. It definitely won’t get to that level in 2018, but increased adoption is guaranteed. In fact, we’re already seeing it as LG’s freshly announced Gram laptops for 2018 will include the port.

Thinner, lighter gaming laptops

NVIDIA’s Max-Q design standard was announced at this year’s Computex, and brought us gaming laptops such as the ASUS ROG Zephyrus GX501, MSI GS63VR 7RG Stealth Pro, and Acer Predator Triton 700, which manage to pack a GTX 1070 or 1080 while measuring just 18-19mm thick.

ASUS ROG Zephyrus

How is this done? NVIDIA has worked with top manufacturers to come up with a design standard that maintains a power-to-performance ratio that allows the GPU to generate less heat. The less heat generated, the less need there is for a bulky cooling system that takes up a lot of space within the laptop’s chassis. A Max-Q GPU doesn’t perform as well as its desktop counterpart, but that’s the trade-off. You’re still getting a GTX 1080-powered laptop that won’t break your back on the go.

We’re not that far off from getting our first look at possibly a few of these things. It’s not just Christmas that’s just around the corner, CES 2018 is very close as well. What are you expecting to see from laptops in 2018?

(yugatech.com, https://goo.gl/TWmLxT)

Sapphire Nitro+ Radeon RX Vega 64 Limited Edition review: Taming Vega’s flaws with brute force

The Sapphire Nitro+ Radeon RX Vega 64 Limited Edition eliminates Vega’s heat and noise issues with the most impressive cooling we’ve seen.

Pros
  • The most effective GPU cooler we’ve tested
  • Very quiet fans
  • Overclocked and built to overclock more
  • GPU support bracket included
Cons
  • Gargantuan size
  • Some coil whine
  • Performs like GTX 1080, priced like GTX 1080 Ti

Four long months after AMD’s flawed Radeon Vega graphics cards launched, custom versions are finally starting to trickle out—and the wait was worth it. The monstrous yet luxurious Sapphire Nitro+ Radeon RX Vega 64 Limited Edition relies on brute force to squash the most troubling problems that plague reference Vega cards. This beast sports the most wildly effective cooler to ever cross our test bench, and it manages to toss in a factory overclock while providing impressive tools for enthusiasts to push performance even further.

Curing Vega 64’s terrible heat and noise involves tradeoffs, though. Keeping the Sapphire Nitro+ Radeon RX Vega 64 Limited Edition cool and quiet requires an awful lot of metal—and an awful lot of power. Then there’s the cost: At $659, this card is priced closer to a GeForce GTX 1080 Ti than Vega’s traditional rival, the GTX 1080. Is it worth the premium?

Let’s dig in.

Sapphire Nitro+ Radeon RX Vega 64 Limited Edition specs and features

Like most custom graphics cards, the Sapphire Nitro+ Radeon RX Vega 64 Limited Edition puts its own unique spin on things, but shares many of the same tech specs as the reference GPU. Here’s a look at the core Radeon Vega tech specs before we examine Sapphire’s own tweaks.

Radeon RX Vega 64 specs

Usually we kick off a graphics card review with a discussion of the tech specs, but that’s not what’s most noteworthy about this card—instead, it’s the sheer size of the custom cooling solution. It’s hard to tell from pictures, but the Sapphire Nitro+ Radeon RX Vega 64 Limited Edition is an absolute monster of a graphics card, dwarfing traditional two-slot designs in every dimension. This beast goes three slots deep, 4.75 inches wide, and 12.25 inches long. It’s big. By comparison, AMD’s Radeon RX Vega 64 reference card sports a two-slot design that’s 3.75 inches wide and 10.5 inches long.

The Nitro+ Limited Edition is so massive, in fact, that Sapphire ships it with a nicely constructed GPU support bracket to keep the card from sagging in your case. The black nickel-plated bracket doesn’t demand any precious additional PCIe slots, unlike some aftermarket GPU supports, and it complements the look of Sapphire’s Nitro branding. The company plans to sell it separately as well.

The Sapphire Nitro+ Radeon RX Vega 64 Limited Edition is so big, it comes with a support bracket.

Sapphire tells me the Nitro+ Limited Edition was designed to let overclockers push Vega 64 as far as possible, and the card’s construction is proof of that. The Nitro+ Radeon RX Vega 64 Limited Edition returns to a vapor-chamber cooling solution similar to Sapphire’s well-received Vapor-X models from the Radeon R9 200-series era.

That’s bolstered by six nickel-plated heat pipes (three 3mm, three 6mm) that help to keep the GPU and high-bandwidth memory stacks chilly, topped by an impressively gargantuan heat sink. The VRMs on this 14-phase card get a separate chamber of their own with two dedicated 6mm heat pipes and use black-diamond chokes, which Sapphire claims are 10 percent cooler and 25 percent more power efficient than standard chokes.

Like I said: Lots of metal.

Not one, not two, but three large fans sit atop the heatsink. They won’t kick into action until the GPU temperature hits 55 degrees Celsius, and since the Nitro+ Limited Edition doesn’t get anywhere near that hot except during gaming, the card stays utterly silent during normal desktop use, unlike reference Vega cards. The fans support Sapphire’s Fan Check and Quick Connect initiatives, allowing you to check their health in Sapphire’s Trixx utility and quickly pop out individual fans if one needs replacing.

Sapphire is also introducing “Turbine-X” with the Nitro+ Radeon RX Vega 64 Limited Edition. Turbine-X adds a PWM fan header to the rear end of the card’s custom PCB, similar to what you find on ROG Strix graphics cards that support Asus’ FanControl technology. That header can power up to two case fans, which then take orders from an on-card hardware controller that monitors five temperature sensors on the graphics card’s PCB to ramp fan speeds up and down as needed. Nifty!

You may have noticed the clear acrylic around the two fans at the card’s extremities. Those house RGB LEDs that can be customized with a new version of Trixx. The Sapphire name on the edge of the Nitro+ Limited Edition glows as well, along with the Nitro logo adorning the card’s full-length backplate. And yes, you can disable the lighting completely if illuminated PC gear gets you grumpy.

The Nitro+ Radeon RX Vega 64 Limited Edition backplate.

Whew! Sapphire sure stuffed this card with cooling potential, and as you’ll see in the performance section later, it pays off when it comes to the biggest flaws in reference Vega 64 cards.

Sapphire ostensibly engineered this for maximum overclocking potential in case you happen to get a golden Vega GPU, but the Nitro+ Radeon RX Vega 64 Limited Edition ships with a decent factory overclock in place, too. Reference Vega 64 cards ship with a 1,274MHz base clock and 1,546MHz boost clock, though those limits can be exceeded if the card stays cool. The Nitro+ Limited Edition offers 1,423MHz base/1,611MHz boost clocks. That’s halfway to liquid-cooled Vega 64 speeds, which top out at 1,677MHz.
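That “halfway” characterization checks out against the boost-clock numbers quoted above:

```python
# Boost clocks quoted above, in MHz
ref_boost = 1546      # air-cooled reference Vega 64
nitro_boost = 1611    # Sapphire Nitro+ Limited Edition
liquid_boost = 1677   # liquid-cooled Vega 64

# Midpoint between the two reference designs
midpoint = (ref_boost + liquid_boost) / 2   # 1611.5 MHz
print(f"Midpoint: {midpoint} MHz; Nitro+ boost: {nitro_boost} MHz")
```

The midpoint works out to 1,611.5MHz—within half a megahertz of the Nitro+ card’s rated 1,611MHz boost clock.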

The Nitro+ Limited Edition also offers a secondary “efficiency” BIOS, accessible via a switch on the edge of the card. It’s slightly more conservative than the default clock speeds for reference Vega 64 models, with 1,273MHz base/1,529MHz boost speeds. Of course, you can use the Wattman tool in Radeon Software to shift the card between Vega-specific Power Save, Balanced, and Turbo power profiles, too.

We’re gonna need a bigger PSU. 

Speaking of power, you’re going to need an enthusiast-class power supply to run the Nitro+ Radeon RX Vega 64 Limited Edition. Sapphire recommends three 8-pin power connectors and a minimum of an 850-watt PSU. That’s not as excessive as the 1,000W requirement for liquid-cooled Vega 64, but for comparison, Nvidia suggests a 500W PSU for its similarly powerful GeForce GTX 1080, which requires only a single 8-pin power connector. Sapphire’s overbuilding this card a bit though—the third power connector’s mostly there to support high-end overclocking. The card’s power connector circuit includes a fuse protector to keep your hardware investment safe in the event of a power surge.

Sapphire also changed up the port configuration on the Nitro+ Limited Edition to make it more VR-friendly. While the reference models pack a single HDMI port and a trio of DisplayPorts, Sapphire’s card has two HDMI ports and two DisplayPort connections instead.

Okay, now you know everything there is to know about the Sapphire Nitro+ Radeon RX Vega 64 Limited Edition—except for how it performs in action. Moving on!

Our test system

We tested the Sapphire Nitro+ Radeon RX Vega 64 Limited Edition on PCWorld’s dedicated graphics card benchmark system. Our test bed is loaded with high-end components to avoid bottlenecks in other parts of the system and show unfettered graphics performance.

  • Intel’s Core i7-5960X with a Corsair Hydro Series H100i closed-loop water cooler ($110 on Amazon).
  • An Asus X99 Deluxe motherboard.
  • Corsair’s Vengeance LPX DDR4 memory ($205 on Amazon).
  • EVGA Supernova 1000 G3 power supply ($200 on Amazon).
  • A 500GB Samsung 850 EVO SSD ($140 on Amazon).
  • Corsair Crystal Series 570X case, deemed Full Nerd’s favorite case of 2016 ($180 on Amazon).
  • Windows 10 Pro ($180 on Amazon).

We’re comparing the heavily customized Sapphire Nitro+ Radeon RX Vega 64 Limited Edition against AMD’s Vega 64 reference duo, the $499 air-cooled RX Vega 64 and $599 liquid-cooled RX Vega 64. (That’s suggested pricing; in the real world, Vega cards are virtually impossible to find and, when available, their prices are wildly inflated.) All were benchmarked using the Balanced power profile on the stock BIOS, running AMD’s new Radeon Software Adrenalin Edition. To show how the Radeon cards compare against their GeForce counterparts, we’ve also included results from the $500 GTX 1080 Founders Edition and the PNY GTX 1080 Ti XLR8, which cost $735 before going out of stock across the internet. Sapphire’s card is almost the same price, right? We prefer to use reference cards in our reviews, but our GTX 1080 Ti Founders Edition died.

Each game is tested using its in-game benchmark at the mentioned graphics presets, with VSync, frame rate caps, and all GPU vendor-specific technologies—like AMD TressFX, Nvidia GameWorks options, and FreeSync/G-Sync—disabled. Given the capabilities of these particular cards, we’re testing at 1440p and 4K resolutions alone. They’d all scream at 1080p.

Game benchmarks

The Division

The Division ($50 on Amazon) just received a massive overhaul with its 1.8 update, adding large new sections to the map as well as new PvP and PvE modes. It’s a gorgeous third-person shooter/RPG that mixes elements of Destiny and Gears of War, using Ubisoft’s Snowdrop engine. We test the game in DirectX 11 mode.

Here, we see some trends that’ll hold true over most of these benchmarks. AMD has kept plugging away at its drivers since Vega’s launch, and now even the reference model holds a very slight lead over the GeForce GTX 1080, whereas before it was very slightly behind, by a mere 1fps in August.

All three Radeon RX Vega cards perform within a hair of each other at 4K resolution. Things open up a bit more at 1440p, where the Sapphire Nitro+ Radeon RX Vega 64 Limited Edition comes in a few frames faster than the standard Vega 64, and a few frames slower than the liquid-cooled Vega 64. That makes sense, as the Nitro+ Limited Edition’s maximum clock speed falls squarely between the speeds of the two reference cards.

None of AMD’s Vega cards, including the $659 Sapphire Nitro+ Radeon RX Vega 64 Limited Edition, comes close to challenging Nvidia’s GeForce GTX 1080 Ti.

Ghost Recon: Wildlands

Next up: Ghost Recon: Wildlands ($60 on Amazon), a stunningly beautiful and notoriously punishing game based on Ubisoft’s Anvil engine. The Ultra graphics settings at 4K absolutely kneecap graphics cards, so we test at Very High, which “is targeted to high-end hardware.” It’s a game that includes some Nvidia GameWorks features, but again, we test with those disabled.

[Chart: Ghost Recon: Wildlands benchmark results]

Vega’s seen some slight frame rate improvements in this game since launch, too. The Nitro+ Limited Edition falls between the Vega 64 air-cooled and liquid-cooled models yet again.

Deus Ex: Mankind Divided

Here’s another graphically punishing game, but this one favors AMD hardware. Deus Ex: Mankind Divided ($60 on Amazon) replaces Hitman in our test suite since its Dawn engine is based upon the Glacier Engine at Hitman’s heart. We dropped all the way down to the High graphics preset for this one, and tested in DirectX 12 alone as that mode performs better on both AMD and Nvidia graphics cards.

[Chart: Deus Ex: Mankind Divided benchmark results]

Rise of the Tomb Raider

Conversely, Rise of the Tomb Raider ($60 on Steam) tends to perform better on GeForce cards. It’s another drop-dead gorgeous game.

[Charts: Rise of the Tomb Raider benchmark results at 4K and 1440p]

The Vega trio hangs tight with the GeForce GTX 1080 at 4K resolution, but falls far behind when the resolution’s dialed back to 1440p.

Far Cry Primal

Far Cry Primal ($55 on Amazon) is yet another Ubisoft game, but it’s powered by the latest version of the long-running and well-respected Dunia engine. It performs well on both AMD and Nvidia hardware. We benchmark the game with the optional Ultra HD texture pack enabled for high-end cards like these.

[Charts: Far Cry Primal benchmark results at 4K and 1440p]

Ashes of the Singularity

Ashes of the Singularity ($40 on Steam), running on Oxide’s custom Nitrous engine, was an early standard-bearer for DirectX 12, and all these years later it’s still the premier game for seeing what next-gen graphics technologies have to offer. We test the game using the High graphics setting, as the wildly strenuous Crazy and Extreme presets aren’t reflective of real-world usage scenarios.

[Charts: Ashes of the Singularity benchmark results at 4K and 1440p]

The Sapphire Nitro+ Radeon RX Vega 64 Limited Edition falls between its siblings yet again. The GTX 1080 destroys AMD’s Vega trio in DirectX 11, but Vega goes toe-to-toe with Nvidia’s card in DirectX 12. If you’re on Windows 10, that means neither side holds a major advantage; but if you’re on Windows 7, which doesn’t support DirectX 12, GeForce is a better buy for Ashes fans.

Power, heat, and noise

Power

We test power under load by plugging the entire system into a Watts Up meter, running the intensive Division benchmark at 4K resolution, and noting the peak power draw. Idle power is measured after sitting on the Windows desktop for three minutes with no extra programs or processes running.
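Distilled to code, that procedure is just a peak over the load samples and an average over the idle samples. The function and sample values below are illustrative assumptions, not actual meter logs from this review:

```python
# Hypothetical sketch: summarize whole-system wattage samples, as a
# meter like the Watts Up might log them during testing. The numbers
# here are made up for illustration, not this review's measurements.

def summarize_power(load_samples, idle_samples):
    """Return (peak load watts, mean idle watts) from sample lists."""
    peak = max(load_samples)                       # worst-case draw under load
    idle = sum(idle_samples) / len(idle_samples)   # steady-state desktop draw
    return peak, idle

load = [310, 405, 398, 412, 390]   # watts during a 4K benchmark run
idle = [62, 61, 63]                # watts after idling at the desktop
peak, mean_idle = summarize_power(load, idle)
print(peak, round(mean_idle))
```

Reporting the peak for load (rather than an average) captures the transient spikes a power supply actually has to survive.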

[Chart: Whole-system power draw results]

It’s no secret that Vega draws wildly more power than Nvidia’s GeForce cards. And it’s no surprise that Sapphire’s overclocked, LED-laden, triple-fan Nitro+ Limited Edition sucks down even more juice than the reference cards. But the compromise was worth it, as you’ll see in the next section.

Heat and noise

We test heat during the same intensive Division benchmark at a strenuous 4K resolution, by running SpeedFan in the background and noting the maximum GPU temperature once the run is over. These tests are conducted after first performing numerous benchmark runs so the cards are warmed up.

[Chart: GPU temperature results]

Look at that temperature. Sapphire’s monstrous cooling system manages to run chillier than the liquid-cooled Vega 64! That’s downright wild (even though AMD should’ve slapped a bigger 240mm radiator on its card). Separate from our formal test, I played Destiny 2 for two hours straight on the Nitro+ Limited Edition at 4K resolution with every graphics option cranked as high as possible, and still the temperature fluctuated between 58 and 59 degrees Celsius. We’ve never seen an air-cooled card run so cold in this test. All that metal pays off.

It’s effective on the noise front, too. The Sapphire Nitro+ Radeon RX Vega 64 Limited Edition’s fans aren’t silent, but they’re damned quiet, especially if you have it in a closed case, and doubly so if you have it in a closed case underneath your desk. (The reason we test graphics cards in traditional cases rather than open-air test benches is to get a feel for real-world use cases.)

Unfortunately, our review sample exhibited a high-pitched coil whine while gaming. The lack of fan noise might have made it more obvious. It’s not so obnoxious that you’d notice it while gaming with music or in-game audio blaring, but you can definitely hear it during menu screens. Interestingly, our liquid-cooled Vega 64 also exhibits coil whine.

Should you buy the Sapphire Nitro+ Radeon RX Vega 64 Limited Edition?

Obnoxious heat and noise levels are the glaring problems with reference Vega 64 graphics cards. AMD’s reference model trades blows with Nvidia’s $500 GeForce GTX 1080 in pure performance, but that doesn’t matter. Just being in the same room as them sucks. Vega’s sky-high power draw is another drawback, but one that some people frankly don’t care about once their PCs are plugged in.

[Photo: Sapphire Nitro+ Radeon RX Vega 64 Limited Edition]

With the Nitro+ Radeon RX Vega 64 Limited Edition, Sapphire rolls up its sleeves, leans into Vega’s massive energy draw, and puts the pedal to the metal to tame those core Vega concerns through sheer brute force. It achieves its goals wonderfully. This hulking beast is the graphics card equivalent of a Hummer. We’ve never seen a high-end GPU hit temperatures this low, and that includes AMD’s liquid-cooled hardware. It’s whisper quiet. It’s incredibly attractive. It’s loaded with extra features. Hell, it even comes with a support bracket to help your GPU stay straight and stylish in your case.

But like a Hummer, all that luxury comes at a steep price. Vega 64 punches in the GTX 1080’s weight class, but at $659, the Nitro+ Radeon RX Vega 64 Limited Edition lurks closer in price to the $700 to $800 GeForce GTX 1080 Ti, and Nvidia’s titan blows it away in sheer performance. Highly reviewed, highly customized GTX 1080 graphics cards like the Asus ROG Strix GTX 1080 can be found for $570 on Newegg, and Nvidia’s GPU is proven to overclock like a champ. Out of the box, the Nitro+ Radeon RX Vega 64 Limited Edition gives you a mild overclock with all the cooling and power management tools you need to crank clocks through the roof manually, but that extra performance isn’t guaranteed. AMD’s Vega architecture isn’t known for having abundant overclocking headroom, either. Even slight speed boosts to Vega result in large power draw leaps.

The Nitro+ Limited Edition’s price will likely climb shortly after the card launches, too. Right now, you can’t find any reasonably priced Vega cards in stock—Newegg’s e-shelves hold a single Vega 64 reference card for a whopping $680 as I write this. And Sapphire isn’t joking around with the Limited Edition tag. It’s not going to eliminate the Nitro+ LE series when the first run sells out, but new inventory will only be made when there’s a free gap in the company’s production schedule.

[Photo: Sapphire Nitro+ Radeon RX Vega 64 Limited Edition]

All that considered, most people would be better off buying a GeForce GTX 1080 or GTX 1080 Ti, depending on your needs and budget.

Still, there will be some people for whom the Nitro+ LE, high price and all, is the right option. AMD doesn’t charge display makers to use its game-smoothing FreeSync technology, so FreeSync monitors lack the hefty upcharge associated with Nvidia G-Sync screens. If you’ve invested in a high-end 1440p or 4K FreeSync gaming monitor—displays like the 144Hz, 1440p Nixeus EDG 27 ($400 on Amazon) or Samsung’s wild 49-inch FreeSync 2 monitor, the CHG90 ($1,175 on Amazon)—then you need a beastly Radeon card to power it. Playing on a 4K FreeSync monitor with Sapphire’s card proved mighty delicious indeed.

If you’re willing to spend the money to achieve high-end Radeon ecosystem nirvana, then the astonishingly cool, impressively quiet Sapphire Nitro+ Radeon RX Vega 64 Limited Edition will give you the best Vega 64 experience possible. It’s much better than the liquid-cooled Vega 64.

(macworld.com, https://goo.gl/g7tfnv)

Beyond iPhone X: 3 potential uses for Apple’s TrueDepth

Apple is investing almost $400m into one of the companies responsible for its TrueDepth camera system on the iPhone X, and it’s likely just the first step in rolling the technology out further in its range. The decision to grant millions from its Advanced Manufacturing Fund to Finisar, the US-based manufacturer of lasers used in TrueDepth, is being billed as a win for American industry; however, it’ll also give Finisar the scope to make more advanced versions. As the lasers get smaller, more capable, and more precise, that opens the door to putting the clever camera into new places.


MacBook Pro

This is the obvious one. Once you’ve lived with Face ID on the iPhone X for a while, suddenly one day you catch yourself taking zero-contact biometric security for granted. For me, it was when the Face ID icon flickered briefly on-screen before the iPhone X pulled out my saved passwords and automatically logged me into my favorite websites.

It may sound ungrateful, since we only just got Touch ID on the MacBook Pro, but I’d love to see Apple integrate a TrueDepth camera into the bezel of its next laptop. Now that might be tricky to do: the camera module is, after all, the reason we have the dreaded notch on the iPhone X in the first place. A MacBook Pro lid is even thinner, and it’s going to take some electronics origami to shrink the whole assembly to fit without leaving a bulge.

Nonetheless, the effort will be worth it. A laptop that automatically logs you in when it sees you, keeps the screen awake while it knows you’re looking, and then locks itself again when you walk away. Video conferencing that can magically remove whatever background you’re actually in front of, since it understands depth information too. Not to mention the convenience of accessing saved passwords and accounts without needing to type in codes or stab a finger at the Touch Bar.

iPad Pro

On the face of it, adding TrueDepth to the iPad Pro is another no-brainer. However, while I’m pretty sure Apple is already doing this – and we might see it as soon as the 2018 iPad Pro refresh – I’m actually hoping the camera isn’t looking at me. Instead, I’d love to see TrueDepth give iPad photography a proper reason for existing.

You’ve probably seen them: people holding up their tablets to snap a shot, and looking fairly ridiculous in the process. However, give the iPad Pro a more capable sensor array and suddenly you’ve unlocked some very interesting depth-perception talents.

For instance, it could allow the iPad Pro to do room-scale mapping, or to scan and digitize 3D objects. We’ve seen add-on cameras that do something along those lines before, and of course Google has Tango, its clever but woefully under-utilized camera tech, but building it into an iPad Pro natively would give things like augmented reality a huge boost. Indeed, it could make the tablet the go-to for AR developers wanting to get ready for Apple’s much-anticipated smart glasses.

Apple Watch

I know what you’re probably thinking: Apple Watch? That tiny little display on your wrist – why would you want a TrueDepth camera there? Here, I’m not talking about the value of Face ID security; it’s the measurement of attention that I’m most interested in.

The Apple Watch is a great way to preview information popping up on your iPhone, but in most cases it demands two hands: one held up, since the wearable is strapped to that wrist, and another to tap and swipe the touchscreen, or twiddle the Digital Crown. That’s fine most of the time, but there are plenty of occasions when I don’t have both hands free. Suddenly, a notification comes in and I find myself trying to touch on-screen controls with the tip of my nose, or figure out how I’m going to read the entire message since I don’t have my other hand free to scroll.

Imagine if, instead, you could navigate the Apple Watch by gaze and attention. That’s well within the TrueDepth camera’s capabilities, with its eye-tracking technology. If it spots I’ve read to the bottom of what’s on-screen it could auto-scroll further down; if my attention is on a certain button, it could assume that’s the command I want to trigger next. All without having to talk to Siri.

Wrap-up

As we’ve seen from other first-generation products from Apple, there’s some clunkiness to go with the cleverness. Nonetheless, the TrueDepth camera looks like it’s here to stay, and Apple’s investment in streamlining production of the complex part – not to mention in R&D on improving its resolution and decreasing its physical size – telegraphs a real intention to do far, far more with the tiny sensor bar than just allow people to animate emojis with their face.

(slashgear.com, https://goo.gl/prvmd6)
