Just about every smartphone owner knows the pain of constantly hunting for an outlet, trying to avoid running out of juice for their all-important devices. I know I have a major problem with this: I'm always fighting with the battery indicator on my LG/Google Nexus 5, often watching helplessly as the device shuts down at the worst possible time. Just last night I was walking on the indoor track at a nearby park's fitness center, and I needed some music to keep me going. Unfortunately, due to a dead battery, my phone was not in the mood to work out with me.
Surfing on a sofa with a smartphone seems silly
One of the reasons I struggle is that I use my smartphone a lot at home, even though I have other devices (my Lenovo Yoga 11S hybrid laptop and my first-generation iPad Mini) that are far better suited to web browsing, reading and watching videos than my smartphone will ever be. Their screens are substantially larger (which is better for my eyesight), and I get very good battery life from them.
Yet I often lie back on my sofa and hold my smartphone up to surf, read and so on. Why? Because I don't want to miss an incoming text message while my phone is plugged in. As great as my Lenovo Yoga and iPad Mini are as devices, I can neither send nor receive my phone's text messages on them.
That's where AirDroid comes in. There are two parts to it: a free app for your Android smartphone and a free companion app for your computer (Windows or Mac) or a web-based client. I even used the web client on my iPad Mini.
Not only can you see your phone's notifications on your computer, you can respond to them from your computer. You can have entire text message conversations without ever touching your phone.
You can even mirror your phone's screen with a feature called AirMirror. This is quite useful when you want to see an entire text message thread or need to operate an app on your phone that has no equivalent on your PC. Because my Nexus 5 is always kept up to date with the very latest version of Android, the AirMirror feature is still catching up to it, so I haven't really gotten to test it out yet.
I don’t see myself using it much, but AirDroid also offers convenient wireless file transfers from your computer to your phone.
If you have more Android smartphone users than computers in your household, you might want to pony up $19.99 per year for the premium version of AirDroid, which can support up to six smartphones on a single computer along with some other nifty features that I will probably never use.
Don’t feel left out if you don’t use Android
AirDroid, as its name suggests, is only for Android smartphones. That means no support for iPhones, BlackBerry phones, or Windows phones.
If you have any two of the following: iPhone, iPad or Mac computer, you may be able to use Continuity, a new feature that integrates iOS 8 and Mac OS X Yosemite so you can send text messages, make iPhone calls, and mirror your screen right from your Mac or iPad. It’s essentially the same as AirDroid but for the Apple-centric crowd.
For other users, I’d like to recommend PushBullet instead. PushBullet is like a scaled down version of AirDroid, but it does the most important thing equally well: sending and receiving text messages. PushBullet is compatible with Android, iOS, BlackBerry OS and Windows Phone.
PushBullet can also work as a browser extension for Google Chrome, Mozilla Firefox, Opera and (coming soon) Safari. The Chrome browser support is indispensable for people who use Chromebooks instead of traditional Windows or Mac computers.
The Big A-Ha
Since you’re not touching your phone while you use one of these apps, you can charge it even in another room and leave the screen off. For me, the screen is always my #1 battery life culprit, and even using the screen while your phone is plugged in prevents it from charging as quickly as it otherwise could.
I can operate my laptop while plugged in even if the battery is totally dead. Some people even operate their laptops plugged in with the battery removed. My smartphone provides me with no such luxury; it won’t even power on while plugged in unless there is enough power in the battery to run it.
Better yet, since my phone is free to charge with the screen off, it’s far more likely to be fully charged when I really need it: away from home and away from power outlets. And that’s a very big deal.
Have you ever had to use rabbit ears to pick up a local TV signal? I have, and I'd guess most of us have at some point in our lives.
Quite frankly, it sucks.
Last year, I embarked on a cord-cutting experiment to save the money I had been spending on cable television. Cutting the cord is easier now than ever, with online services like Netflix providing affordable content and DTV broadcasts providing free local channels.
Digital Television (DTV) is an advanced broadcasting technology that has transformed the television viewing experience. DTV enables broadcasters to offer television with better picture and sound quality, and multiple channels of programming. Since June 13, 2009, full-power television stations nationwide have been required to broadcast exclusively in a digital format.
Federal Communications Commission
Despite these technological advancements and my modern flat-panel HDTV purchased new in 2010, my experiment failed miserably.
Apart from missing the programming from certain cable channels that could not be substituted online (I’m looking at you, ESPN), the most annoying thing was having to constantly adjust my antenna to get a good picture only to have it flake out on me a minute or two later. An omnidirectional antenna was still just as flaky, and I couldn’t even adjust it.
I live in the city — within 10 miles of all the local broadcast transmitters, in fact — so it wasn’t a matter of distance. I bought a newer amplified antenna that was a step above the basic, so there was no good reason why I should have trouble. Of course, in the city, there are a lot of tall buildings around that can interfere, but there is nothing especially tall between my apartment and any of the transmitters. I even contacted one of the local broadcast stations and told the engineer where I lived — he said something to the effect of “From where you are, you should be able to get a clear picture with a paper clip.”
Very frustrating. So frustrating, in fact, that I decided to pony up the cash and reinstate my cable TV subscription. There just seemed to be an inevitability about paying for television, and to me that payment was worth avoiding a lot of frustration.
Since that time, a few things have happened that lead me to believe this won’t be so inevitable for long.
Broadcaster/network/provider contract disputes. This isn’t exactly a new phenomenon, but it’s definitely intensifying. Here in Indianapolis, the parent company of local CBS affiliate WISH-TV refused to pay the CBS Network more money for broadcasting rights to network programming, so CBS just found another local station (WTTV-TV) whose parent company was willing to pony up. On a related note, our local NBC affiliate (WTHR-TV) got involved in a contract dispute with DirecTV that led to a brief discontinuation of NBC programming for DirecTV subscribers. The dispute was resolved, presumably with DirecTV paying WTHR more money…and those subscribers ultimately paying higher bills for the privilege. Presumably the pressure from WTHR to insist on more money came from having to pay higher broadcasting rights fees to NBC. And the pressure on both the NBC and CBS networks to charge the affiliates higher rates is coming largely from sports — especially the NFL. It was no coincidence that DirecTV and WTHR resolved the dispute just in time to get an important Sunday Night Football game involving the Indianapolis Colts back on the air.
Broadcast networks offering online subscriptions. In addition to Netflix, television networks have increasingly begun to post programming online…it’s a good way to catch up on missed episodes and the like. But CBS recently made a much bolder move: offering online-only paid subscriptions to live television and archived episodes. If the other major broadcast networks follow suit, then that will enable them to completely bypass the local affiliates and go directly to viewers.
Local news online. Those affiliates are even beginning to broadcast their local news shows on a live online stream as an alternative to over-the-air or cable/satellite.
Easier ways to get online programming onto the TV screen. One of my favorite gadgets is Google’s Chromecast. It’s small, it only costs $35, and it can use your WiFi connection to beam whatever is on your smart phone, tablet or on the Chrome browser on your PC to your TV with the touch of a button. I have one plugged into one of my TV’s HDMI ports, and I use it frequently. Now that the Chromecast device has been out for over a year, it has dramatically improved in terms of app support and full-screen casting. Dead simple. And even local affiliate stations are coming out with their own apps.
So that raises the question: why do we need to broadcast television over the air at all anymore? The obstacles are more on the business side than the technology side. Here's where the FCC needs to be bold (but probably won't be).
Preserve net neutrality. Video content takes up a large amount of bandwidth (especially high-definition and 4K video), so it’s important not to have the telecom companies create slow and fast lanes. After all, the concepts that I’m talking about that could save costs for consumers are a major threat to their cash cow business. Right now, the FCC is taking public comments on this issue — so make your voice heard!
Mandate live, free Internet streaming for all over-the-air broadcasts. By law, television stations are supposed to be operating in the public’s interest. As a condition of having a broadcast frequency, the FCC could also mandate that the affiliates provide the same content over the Internet. At the very least, they should have to stream the locally produced content like the local news. This should not be too difficult a hurdle because a lot of stations are already doing this. Now that we have made the full transition to DTV, all of the video content is already digital anyway. But ideally they should also have to stream the network content too for people within their broadcasting area. The programming is free by antenna, so why shouldn’t it be free online?
Complete the National Broadband Plan. The FCC is already hard at work implementing the National Broadband Plan, which should dramatically expand high-speed Internet access across the nation. Now that the DTV transition is complete, the older analog television frequencies have gone to emergency response, and the remainder will be auctioned off — presumably to telecommunications companies. The FCC estimates that there are about 7 million households currently without any access to broadband Internet at any price because they are located in sparsely populated areas where telecommunications companies could not expect much of a return on their infrastructure investments. Of course, having broadband access available for a price does not equate to actually having broadband access.
Auction off the DTV spectrum. Just as the FCC is auctioning off the portion of the analog TV spectrum not being used by emergency responders, it could also raise funds by auctioning off the DTV spectrum and using those funds to help subsidize broadband Internet access for those who cannot afford to pay for it. Ideally, this would be revenue-neutral, just like the National Broadband Plan is. With a mandate already in place to live stream all broadcast content, local TV stations would not need to change much.
Hopefully by this point in time — let’s say 10 to 15 years into the future — almost no one would still be using a TV without at least an HDMI port. And we already have lots of cheap Internet-based devices today like the Chromecast or Roku that could simply have buttons for local channels right next to their Netflix buttons. It would not be a huge leap. Much like the FCC created a coupon program for DTV converters, they could create something similar for an Internet-based device like this.
Television stations, of course, make their money from advertising, and the amount advertisers are willing to pay is driven by Nielsen ratings. But even Nielsen has said that Internet-based devices have reduced overall television viewership. Of course, you can still measure the number of hits a video receives (see everything on YouTube), and online video advertising is quite common these days. Getting the type of demographic information that Nielsen measures is a little harder than that, but having users complete a web-based questionnaire is a lot easier than having Nielsen install boxes in people’s homes. The Internet provides a far larger sample size to measure all the hits — not just a select few to extrapolate from.
What do you think? What other obstacles might there be to permanently throwing away the rabbit ears and the huge rooftop antennas?
Note: If you are not familiar with basic photography terms like exposure, sensor, aperture, depth of field, shutter speed, ISO and focal length, this article will make a lot more sense after you read a primer on photography. I recommend Lesson #1 and Lesson #2 of the Photo Basics Series. Really, the whole series is worth a read if you’re a beginner.
Just as the advent of digital photography killed the film business, the advent of smartphone cameras is taking a major bite out of demand for dedicated consumer cameras. Smartphone cameras have gotten so good, in fact, that the best ones can replace entry-level “point-and-shoot” cameras.
Enthusiast and professional photographers seem to be undeterred from buying and upgrading high-end gear, but the mass market is shifting.
What went wrong?
For years, digital camera manufacturers sold new point-and-shoot models in droves and then convinced consumers that they could take much better pictures if they only upgraded to a new model with a longer zoom lens and more megapixels. For example, the Nikon Coolpix P600 has a 16-megapixel sensor and a 60X zoom lens.
While there were indeed some legitimate improvements in consumer camera technology during that period — the move from CCD-based to CMOS-based sensors led to huge gains in low light performance — consumers would need to dive deep into the spec sheet to notice gains in ISO flexibility. The megapixel counts and the zoom ranges, on the other hand, were printed right on the camera itself.
Yet these were exactly the wrong things for consumers to key in on when buying a camera. I’ve already written about the pitfalls of ever-increasing megapixel counts on consumer cameras, including smartphones. But now I want to talk about why you probably don’t want a point-and-shoot camera with a 60X zoom lens.
What you should know about super-telephoto lenses
We’ve all seen the professional sports and wildlife photographers standing behind their massive telephoto lenses mounted on monopods, and maybe we wanted to be just like them.
A pro like this is most likely operating a DSLR camera with a full-frame sensor (36 x 24mm, matching the image size of a frame of 35mm film), so just to keep things consistent, I'll talk about lenses in terms of full-frame equivalency.
If a professional wildlife photographer with a Canon camera wanted a really long telephoto lens, he or she might buy this one: the Canon EF 600mm f4 IS II USM. Purchase price: $11,999.00. The lens is a massive 17.6 inches long and weighs in at 8.64 pounds, and the focal length is fixed at 600mm…no zooming.
Let's compare that to the 60X lens on the Nikon Coolpix P600: fully extended, it reaches 1440mm of focal length in 35mm terms. That's more than twice the reach of that long, $12,000 pro lens, and the P600's lens also covers the entire range from 24mm all the way up to 1440mm while being far smaller.
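The zoom arithmetic here is simple enough to sketch in a few lines of Python (the helper name is my own, purely illustrative; the 24mm wide end and 60X factor come from the P600's spec sheet):

```python
# A lens's zoom factor is just the ratio of its longest to its
# shortest focal length; "equivalent" focal lengths scale the same way.
def zoom_factor(wide_mm, tele_mm):
    return tele_mm / wide_mm

# Nikon Coolpix P600: 24mm-equivalent wide end, 60X zoom
p600_tele = 24 * 60
print(p600_tele)                # 1440 (mm, 35mm-equivalent)

# Canon EF 600mm prime: a fixed focal length, so no zoom at all
print(zoom_factor(600, 600))    # 1.0
```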
Sounds great, right? A quasi-telescope that fits in your hand and takes pictures…all for under $400. Unfortunately, all that glitters is not gold, and there’s a reason that the pros buy huge $12,000 telephoto prime lenses to mount on their $7,000 full-frame DSLR camera bodies.
For the best possible image quality, a prime lens (that is, a lens with a fixed focal length and no zoom) is the way to go for a photographer with an interchangeable lens camera. Of course, pro photographers are sometimes willing to trade a bit of image quality for the focal length versatility of a zoom lens. Take the Canon 70-200mm f2.8L USM lens: divide 200 by 70 and you get 2.86. That's not even a 3X optical zoom, and yet the sticker price is $1,449.00.
So why doesn’t it cover a longer focal length range? There are three reasons:
Size. This lens is already 7.6 inches long and weighs 2.86 pounds; increasing the focal length would make it nearly unbearable to carry. A lens on a full-frame camera has to be wide enough to project an image circle that covers the big sensor, and its physical length grows with both that width and the focal length. That's why point-and-shoot cameras with smaller sensors can pack very long lenses while still fitting in your hand: if the sensor is small, the lens does not have to be as wide to cover it.
Aperture. There are zoom lenses for full-frame SLRs with longer focal ranges that are cheaper and smaller than that one, but only because they have narrow apertures. For example, the Canon EF 70-300mm f4-5.6 IS USM lens is only 5.6 inches long, weighs 1.39 pounds and costs “just” $649. Because of the laws of physics and light, the wide f2.8 aperture makes the 70-200mm lens larger at any given focal length. (You may sometimes hear photographers refer to a lens as being “fast” or “slow.” A lens with a wide aperture like f2.8 across all focal lengths is considered “fast” because it accommodates fast shutter speeds.)
Barrel distortion. Moving from a prime lens to a 2.86x zoom like this is already a tradeoff because zooms inherently distort the picture a little as compared to prime lenses. The longer the zoom range of a lens from the wide end to the telephoto end, the more the image is distorted, generally speaking. By distortion, I mean the image is actually bent out of shape a little. Some lens manufacturers, most notably Tamron, manufacture “all-in-one” zoom lenses for people (usually amateurs just starting out with an entry-level DSLR) willing to make this distortion tradeoff in exchange for not having to carry multiple zoom lenses, but even those compromised lenses tend to hover around the 10X to 12X range. And you never see these all-in-one lenses with constant fast apertures like f2.8 because they would have to be absolutely massive.
So, circling back to the Nikon P600, how good is that long zoom lens? Not very. First of all, the maximum aperture ranges from f3.3 at the wide end to f6.3 at the telephoto end. That f6.3 is a killer in low-light situations, and with a compact camera, you can’t raise the ISO much to compensate because your images will get noisy very quickly. And barrel distortion rears its very ugly head toward the wide end of the focal length range…turning your nice, straight vertical lines into slanted lines to the point where you might be reminded of the opening text crawl from the Star Wars films.
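Aperture “stops” follow a simple logarithmic rule: each full stop doubles the light, and f-numbers step by a factor of the square root of 2 per stop. A quick sketch (the helper name is mine, just for illustration) shows how big the gap between f2.8 and f6.3 really is:

```python
import math

# Stops of light between two f-numbers: each stop doubles or halves
# the light gathered, and f-numbers change by sqrt(2) per stop.
def stops_between(f_slow, f_fast):
    return 2 * math.log2(f_slow / f_fast)

print(round(stops_between(5.6, 2.8), 1))  # 2.0 -> f2.8 passes 4x the light of f5.6
print(round(stops_between(6.3, 2.8), 1))  # 2.3 -> the P600 at full telephoto vs. a constant-f2.8 lens
```

More than two stops means an f2.8 lens lets you use a shutter speed four to five times faster at the same ISO, which is exactly what fast-moving subjects in dim light demand.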
Back to that huge Canon 600mm f4 lens…there’s also a reason (besides just keeping it from tipping over) that photographers often mount super-telephoto lenses on monopods or tripods. It’s to keep the lens from shaking while taking a photo. At wide, normal, and even short telephoto focal lengths it’s not much of a concern, but in the super-telephoto range, a little shake of your hand can totally ruin your image. So if the professionals need help stabilizing a 600mm lens, do you really think you can keep your hands steady enough to shoot reliably at a 1440mm focal length without a tripod or monopod? (And, yes, that professional lens has image stabilization — hence the IS in the name.)
Fortunately, there are some newer fixed-lens cameras on the market that can really help you take better photos with a viewfinder and plenty of zoom…by emphasizing more important specs than megapixel count or extreme zoom range. Unfortunately, they do tend to cost more than more ordinary megazooms like the Nikon P600.
The Panasonic DMC-FZ200 (MSRP $599) offers a somewhat more reasonable 24X zoom (25-600mm in full-frame terms) with a constant, bright f2.8 aperture across the focal length range. Because this is still a point-and-shoot camera with a small 1/2.3″ sensor, you should probably treat anything above ISO 800 as a last resort due to noise. That matters most when you're trying to capture fast-moving subjects at long distances, like sports or wildlife, because those require fast shutter speeds to avoid motion blur. So having an extra stop or two of light at the telephoto end can make a big difference, and the f2.8 aperture delivers. Even at the reduced zoom factor of 24X, you still reach the super-telephoto length of 600mm. Barrel distortion would still be a significant issue with a 24X lens, just not as much as with a 60X lens.
At an MSRP of $899, the Panasonic DMC-FZ1000 is the FZ200’s big brother. Instead of a 1/2.3″ sensor, it has a larger 1″ sensor for improved image quality and cleaner photos at higher ISO settings. A larger, wider sensor requires a larger lens for the same focal length and aperture, so this one only manages a 16X zoom (25-400mm focal length range) with an f2.8-4 aperture across the focal length range. A 400mm zoom is still a lot — this is typical for the long lenses you see on the sidelines of soccer or football games, and f4 is a full stop brighter than f5.6, so it’s still somewhat “fast.”
The excellent Sony Cyber-shot DSC-RX10 has dropped from its original sticker price of $1,299.00 to a slightly less shocking $999. Like the Panasonic DMC-FZ1000, the Sony RX10 has a 1″ sensor. The RX10 has an 8.3X zoom, which provides a focal range of 24-200mm and a constant aperture of f2.8 across the focal range. The shorter zoom may cause you to miss out on some distant shots, but the photos you can get should look great. Another advantage of cameras with larger sensors and wide-aperture zoom lenses is that you can get shallow depth-of-field effects so you can have those dreamy defocused backgrounds with the foreground in focus.
The Olympus Stylus 1 (MSRP $599) falls somewhere in between the Panasonic FZ200 and the Sony RX10. It has a 1/1.7″ sensor, which is larger than the 1/2.3″ sensor in the FZ200 but smaller than the 1″ sensor in the RX10 and FZ1000. It also has a 10.7X zoom with a constant f2.8 aperture across the 28-300mm focal range.
Cameras like these will inevitably fall in price as technology improves, and that’s good for everyone. Right now, for about the same price, you can also get an entry-level DSLR or mirrorless interchangeable lens camera and maybe even a 2-lens kit for long zooming. Changing lenses isn’t all that difficult as long as you don’t mind carrying two lenses with you. You won’t get f2.8 lenses anywhere near this price range, but with big DSLR sensors, you can dial up the ISO with a lot less of a noise penalty than with a point-and-shoot camera, so you may not need such a fast lens. Those are for the professionals.
Just remember that when it comes to megapixels and zoom, more isn't always better.
If you’re a pro photographer, especially a pro photojournalist, sports or wildlife photographer, the camera bodies of choice today are the Nikon D4S (MSRP $6,499.95) and the Canon EOS-1D X (MSRP $6,799.00). Not for the faint of wallet, especially with pricey full-frame lenses to buy on top of that.
And even if you have the budget for cameras like these, they are also bulky and heavy. The D4S body alone weighs 1,180 grams (about 2.6 pounds), and the EOS-1DX body alone weighs in at a whopping 1,530 grams (just under 3.4 pounds). Add on a battery grip, a big full-frame telephoto lens and a bag with a few additional full-frame lenses, and you’re looking at a serious backache.
So why do professionals subject themselves to all that expense, bulk and weight? Because these cameras are the best tools available for their needs. Here are a few reasons why:
Full-frame sensors. Full-frame image sensors (that is, sensors the same size as 35mm film) provide lots of advantages in terms of shallow depth of field effects and image quality at high sensitivity (ISO) settings. Quite simply, bigger sensors allow more light to come in. A full-frame sensor (depicted in orange in the image below) is 36mm x 24mm.
Fast burst rates. The D4S can shoot continuously at 11 frames per second, which is very fast, and the EOS-1D X can shoot even faster at 14 frames per second.
Fast, accurate autofocus. Fast burst rates are meaningless without accurate autofocus, especially for fast-moving subjects like soccer players or birds in flight. You will just end up with a whole lot of blurry images. So these DSLR cameras have phase detect autofocus sensors.
Optical viewfinders. Those who use these cameras in the field can’t afford to miss a moment to capture the perfect shot. That’s one reason they tend to prefer optical viewfinders, because “Live View” on an electronic viewfinder is not truly live — electronic viewfinders are slightly delayed because the camera has to process the image coming in through the sensor as well as any exposure adjustments before the image appears in the viewfinder. For most photography work, the delay is not significant enough to matter, but for these users, it might be the difference between capturing the shot and missing it. Also, most electronic viewfinders do not display the entire frame.
Weather sealing. Field work for sports, wildlife and general photojournalism can happen in all sorts of conditions and elements. I experienced this myself back in September 2002 when I was working as a reporter for The Mooresville-Decatur Times, and a tornado struck the town. After the tornados had passed, I had to go out and photograph the storm damage, and it was still raining quite a bit in the aftermath. It was a good thing the Nikon D1 I was using was weather sealed.
Fast ergonomics. These cameras are loaded with physical buttons that allow the photographer to change settings on the fly without having to rely on menus. The deep grips also help keep the camera steady in the photographer's hand. Ergonomics might not be such a big deal for amateurs, but professionals who take very large numbers of photos and spend a lot of time holding a camera need a camera that “feels” right and doesn't require navigating a lot of menus to change settings.
The Panasonic Lumix GH4 has a burst rate of 12 frames per second, a deep grip and lots of external dials just like a professional DSLR, a new “Depth from Defocus” autofocus system that is fast and accurate for tracking moving subjects, weather sealing and more. And, by the way, the MSRP is $1,699.99, and it weighs only 560 grams (about 1.2 pounds).
Your fancy 1080p HDTV only has about 2 megapixels of resolution
Not only that, but it does something significant that the flagships from Canon and Nikon can’t: it shoots 4K ultra high-definition video. But why is this important to still photographers?
Video is, fundamentally, a series of photographs…one in each frame. And, at a minimum, video is shot at 24 frames per second.
Most modern high-definition cameras shoot 1080p video with a resolution of 1920 x 1080 pixels. Multiply 1920 by 1080 and you get approximately 2 million pixels, or about 2 megapixels. So each frame has very low resolution by photography standards and is only usable for very small prints: a 4″ x 6″ at 300 dots per inch. Larger prints are possible, but the quality degrades.
But 4K video is shot at 3840 x 2160 pixels. Multiply 3840 by 2160, and you get more than 8 million, or more than 8 megapixels. That’s enough resolution for an excellent 8″ x 12″ print — larger than a full page in National Geographic or Sports Illustrated. Cinema 4K is shot at 4096 x 2160, but that’s a strange aspect ratio for still photos.
Unless the photographer needs a print even larger than 8″ x 12″ and cannot fudge on print resolution at all (an 8-megapixel photo could cover a 2-page spread in National Geographic at 200 dpi instead of 300, and that's still considered “good” resolution), the GH4 offers the ability to capture very printable action photos at 24 frames per second.
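The print-size arithmetic in the last few paragraphs is easy to verify yourself; here's a small sketch (the helper names are my own, not from any photo software):

```python
# Megapixels of a video frame, and the largest print it supports
# at a given printing resolution in dots per inch.
def megapixels(width_px, height_px):
    return width_px * height_px / 1_000_000

def max_print_inches(width_px, height_px, dpi=300):
    return (width_px / dpi, height_px / dpi)

print(round(megapixels(1920, 1080), 1))       # 2.1 -> a 1080p frame
print(max_print_inches(1920, 1080))           # (6.4, 3.6) -> roughly a 4x6 print
print(round(megapixels(3840, 2160), 1))       # 8.3 -> a 4K UHD frame
print(max_print_inches(3840, 2160))           # (12.8, 7.2) -> roughly an 8x12 print
print(max_print_inches(3840, 2160, dpi=200))  # (19.2, 10.8) -> a 2-page spread at 200 dpi
```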
So why isn't everyone jumping on the GH4 bandwagon? The GH4 has a few drawbacks, but they don't seem significant enough to outweigh its savings in price and weight, or its potential for 24 fps photography.
Sensor size. The GH4 uses a Four Thirds sensor, which is considerably smaller than full frame: 18mm x 13.5mm, depicted in lime green above. There are some compromises in terms of depth of field and image quality at very high ISOs, but the GH4 produces images that are quite usable for 4K video up to ISO 3200, and usable up to ISO 6400 at smaller sizes, including 1080p video. Sure, a full-frame sensor provides super-shallow depth of field effects, but that’s more important for portraiture than for sports or wildlife photography. You can still get somewhat shallow depth of field with Micro Four Thirds cameras. And, since the sensor is smaller, the lenses can be a lot smaller and lighter too. For situations when you really need full frame image quality and 4K video, you can buy a Sony A7S for $2,499.99 in addition to the GH4 and still spend less and bear less weight than you would on a D4S or EOS 1D X. (Unfortunately the Sony A7S is not weather sealed, and its burst rate for stills is only 5 fps, or else I would recommend it instead of the GH4.)
No optical viewfinder. The biggest reason why the Panasonic GH4 and the Sony A7S are so light is that they do not have the mirror box and pentaprism found in traditional DSLR cameras. And that means no optical viewfinder — only an electronic “Live View” viewfinder. But electronic viewfinders are not what they used to be — the electronic viewfinders on these cameras have so little lag that it's “nigh imperceptible.” Both electronic viewfinders also cover 100% of the frame, which is an improvement over EVFs of the recent past. And the advantage of Live View is that you can see the results of exposure adjustments on the fly before shooting. EVF latency will never be zero, but with the GH4 and cameras like it, it's awfully close.
Limited lens choices. Panasonic has the basics down in terms of professional lenses with its 12-35mm (24-70mm equivalent) and 35-100mm (70-200mm equivalent) weather-sealed, stabilized f2.8 zoom lenses. There are also some tremendous prime lenses in Panasonic's lineup to get most of the shallow depth of field effects that you can find in a full-frame camera…including a 15mm (30mm equivalent) f1.7, a 25mm (50mm equivalent) f1.4 that I personally own and love, and a 42.5mm (85mm equivalent) f1.2 portrait lens. Plus, Olympus has some great lenses of its own in the Micro Four Thirds lineup, with more pro lenses on the way. Canon and Nikon still have the edge in terms of lens selection, but a professional can build a fairly complete Micro Four Thirds lens kit at much lower prices and with much less weight than with full-frame Canon or Nikon lenses. (And maybe, just maybe, the excellent Olympus OM-D E-M1 will get 4K video as well…if not, we know Olympus is getting into the 4K game soon.)
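Those “equivalent” focal lengths come from the sensor's crop factor: Micro Four Thirds has a 2.0x crop relative to full frame, so each actual focal length simply doubles. A quick sketch (the helper name is my own, for illustration only):

```python
# 35mm-equivalent focal length = actual focal length x crop factor.
# Micro Four Thirds sensors have a 2.0x crop factor vs. full frame.
MFT_CROP = 2.0

def equivalent_focal(actual_mm, crop=MFT_CROP):
    return actual_mm * crop

print(equivalent_focal(12), equivalent_focal(35))  # 24.0 70.0 -> the 12-35mm "24-70"
print(equivalent_focal(25))                        # 50.0 -> the classic "nifty fifty"
print(equivalent_focal(42.5))                      # 85.0 -> the portrait length
```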
Will this be enough to pry the Nikon D4S or the Canon EOS-1D X out of a professional photojournalist’s arthritic hand? Or will the next-generation pro DSLRs just start shooting 4K video too? Will we start to see more 8K video (with each frame being a 32-megapixel photo) or even higher resolutions? Memory cards that can hold terabytes of ultra HD video? Will the DSLR video revolution reverse course and place camcorders in every photographer’s hand instead of still cameras in every videographer’s hand? Will photo and video editing software converge to help photographers sift through 24 photos for every single second they were covering an event to find that one perfect image for publication? Smart phones that take professional-quality photos? It’s certainly an exciting time for photography.
I am something of a geek. Now I know this will shock and confuse many of you who saw me as a James Dean, Rebel-Without-a-Cause type, but it’s true. Technology fascinates me.
So for a few years now I have had a techie fantasy of getting an HDTV and using it as a big computer monitor. After all, LCD projectors work very well for displaying PowerPoint presentations for a whole room. Of course, a nice TV like that costs money (and I could forget about the projector too), so my dream was deferred. That was, however, before Craigslist.
Over the weekend I found an ad for a 26-inch Samsung CRT (read: heavy, bulky tube, not flat screen) HDTV for $125. I contacted the seller, saw the TV and talked him down to $100. That’s a $100 HDTV, folks. And the widescreen picture looks fantastic. I thoroughly enjoyed watching the NBA Finals on it, even from an over-the-air signal.
My laptop not only has all the regular connections but also an HDMI output designed for doing this very thing. So I bought an HDMI cable from Walmart and hooked it up. And that’s when my heart sank.
There’s nothing wrong with the TV or the computer at all. Everything displays properly. In fact, Windows Media Center does a fantastic job of bringing high-definition Internet content to my TV screen.
But then I tried to surf the Internet like I normally do on my laptop, and that’s where the laws of physics betrayed me. Major fail. Reading text on the “big screen” only works if the font size is bumped up to somewhere around 20 points. That is, of course, why PowerPoint works so well on a projector…because it uses text that’s already big and blows it up to a size much larger than a 26-inch TV screen.
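The mismatch can be sketched with a quick back-of-the-envelope calculation. All the numbers below are my own assumptions for illustration (a 26-inch 720p set viewed from the couch versus a typical 15.4-inch laptop at arm’s length), not measurements of the actual hardware:

```python
import math

def ppi(diagonal_in, px_w, px_h):
    """Pixels per inch of a display, from its diagonal size and resolution."""
    return math.hypot(px_w, px_h) / diagonal_in

def angular_size_deg(size_in, distance_in):
    """How big an object looks to the eye, in degrees of visual angle."""
    return math.degrees(2 * math.atan(size_in / (2 * distance_in)))

# Assumed numbers, not measurements:
tv_ppi = ppi(26, 1280, 720)        # 26" 720p TV: about 56 pixels per inch
laptop_ppi = ppi(15.4, 1280, 800)  # typical laptop: about 98 pixels per inch

glyph_px = 16  # roughly 12-point body text rendered on screen

tv_text = angular_size_deg(glyph_px / tv_ppi, 8 * 12)      # TV from 8 feet
laptop_text = angular_size_deg(glyph_px / laptop_ppi, 24)  # laptop from 2 feet

# The same text looks less than half as tall from the couch.
print(round(laptop_text / tv_text, 1))  # -> 2.3
```

The TV’s pixels are bigger, but the extra viewing distance more than cancels that out, which is why normal web text needs a hefty size bump before it becomes readable from the sofa.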
If anybody has any ideas on how I can correct this text problem, please record a video of how to do it so I can actually see it. I wouldn’t be able to read an e-mail on the big screen anyway.
I have to admit it. I have a problem. (That’s the first step to recovery, right?)
I love to travel.
I don’t get tired of moving around. I am hooked on seeing new and exciting places all the time. I love to explore, whether it’s a new city, a breathtaking mountain range or a soothing beach.
Right now I am writing this from a hotel room in Panama City Beach, which is 769 miles from home. I never really thought I was capable of homesickness, but I do miss people when I am separated from them.
I got a text message this morning on my cell phone and wished I could do more than just type back. This afternoon I was reading a book that I borrowed from a friend and imagined her explaining the notes she had written in the margin. (One more reason to borrow books instead of buying them!) I imagined another good friend daring me to do something dangerous that made me uncomfortable but was probably character-forming.
I realize now how fortunate I am to live in an era when I can stay constantly connected to the people I care about through the Internet and cell phones. A text message or a Facebook status is no substitute for being there, but it’s a good stopgap for homesickness.
Likewise technology enables people to do more work while staying closer to their families. The other day, my manager was both at home AND at work thanks to her laptop. In my previous job I was able to stay in touch with my office even when I was away at a conference. I could answer my cell phone and customers could not tell where I was.
I realize that not everyone has a job for which this sort of virtual office makes sense, but it’s more common than you might think. And in my humble opinion, that’s a very good thing.
I’m typing this blog entry on my wife’s six-year-old Dell PC. We use it for web browsing, instant messaging and watching YouTube videos. Sure, I’ve added some memory and a $10 used DVD-ROM drive, but it’s fundamentally the same machine — and it still handles the daily tasks remarkably well.
Unfortunately, the one thing that it doesn’t do is travel. It’s a desktop, and my wife needed a computer that she could take back and forth to school. So we went laptop shopping.
And what did we find? Pink notebooks, green notebooks, thin-and-light notebooks, media center notebooks, desktop replacement notebooks…you name it. There’s something for everyone out there.
But what really struck me was the sheer power. Most consumer desktops and laptops today have dual-core processors — each core with as much power as our whole not-so-ancient computer — and truckloads of memory.
They use it too. The bare minimum for Windows Vista Home Premium is 1 gigabyte (that’s 1,024 megabytes) of memory, but it really takes 2 gigabytes (2,048 megabytes) to do it justice.
Now I’m not suggesting we all go back to DOS or even Windows 95, but Windows 2000 could run on 32 megabytes of memory and run very well on 64 megabytes. That means Windows Vista takes 32 times as much memory as Windows 2000 did.
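The arithmetic is easy to check with the figures quoted above (this is just the quoted minimums restated in code, nothing more):

```python
# Memory figures quoted above, expressed in megabytes.
MB = 1
GB = 1024 * MB

vista_minimum = 1 * GB     # Vista Home Premium bare minimum
vista_comfortable = 2 * GB # what it takes to do Vista justice
win2000_minimum = 32 * MB  # what Windows 2000 could run on

print(vista_minimum // win2000_minimum)  # -> 32
```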
Is it 32 times better? I have used every version of Windows since 3.1, and I can safely say Vista isn’t even two times better than 2000.
At work, we have a few “dinosaurs” still running Windows 2000, and you know what? They still work very well. You can still surf the Internet, watch videos, listen to music, and even leave a few windows open at the same time. In fact, I used to surf the Internet on — GASP! — a 486 running Windows 95 with 16 megabytes of memory!
As long as you keep the spyware away with a good antivirus program and keep the junk off the hard drive, they can still do it all at a very good clip. And why shouldn’t they…that’s what they could do when you bought them, right?
Sure, there have been some significant innovations along the way that justified an upgrade. Windows 3.1 was the first really usable version. Windows 95 was the first one really designed for the Internet, with TCP/IP built in. Windows NT Workstation (circa 1996) brought a new level of stability in place of the hated blue screen of death (most consumers didn’t benefit from this huge innovation until Windows XP). Windows 98 Second Edition brought us USB support. These were all critical upgrades that we take for granted today.
But what does Vista bring us? Transparent windows that hog our video memory. Gadgets just like the ones found in Google Desktop. A confusing Start menu. Annoying security prompts for everything. Sure, there’s a cool factor, but there really isn’t anything in there that justifies the stiff hardware requirements or the extra cost.
So even though I couldn’t talk my wife into a perfectly capable used laptop from eBay, I did talk her into sticking with Windows XP instead of Vista. If a laptop with 2 gigabytes of memory and a dual-core processor will run Windows Vista Home Premium reasonably well, it will chew up XP, spit it out and ask for seconds.
Then again, maybe I won’t even notice.
P.S. I know I’m starting to sound like an old codger, and there are some high-end users and gamers out there who really put the extra computing power to good use.
Update: I wrote this way back in late 2007 on MySpace and moved it over to this blog. It’s now May of 2013, and there have been lots of changes since then.
1) Windows 7 launched on October 22, 2009. It was what Windows Vista should have been in the first place. Windows 8 launched on October 26, 2012. It was a big, polarizing change that made it easier to run Windows on a tablet — even though you could technically do capacitive touch in Windows 7.
2) Microsoft announced that Windows XP support will end on April 8, 2014. That means no more security patches for XP, and that could lead to big security problems for XP users. Now there’s an extremely good reason to upgrade. If you still have an older machine that you don’t want to sink any money into, you might try switching to one of the many Linux distributions available for free. Ubuntu is probably the most popular and straightforward distribution out there, but there are lots of options. You may be pleasantly surprised at how fast your old hardware runs on Linux!
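If you do boot an old machine from an Ubuntu (or other) live USB, a few standard Linux commands will tell you what you have to work with. These are common tools available on virtually any distribution, and the exact output will vary by machine:

```shell
# Run from a Linux live session (e.g. an Ubuntu live USB).
grep MemTotal /proc/meminfo          # installed RAM
grep -m1 "model name" /proc/cpuinfo  # CPU model
df -h /                              # disk space on the root filesystem
```

Comparing those numbers against a distribution’s stated minimum requirements is a quick way to pick between full Ubuntu and one of its lighter-weight variants.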
3) Newer computers aren’t just faster and more powerful these days; they can do entirely new things. Certain laptops with Intel’s Core series processors can beam content to your HDTV wirelessly with an adapter (a feature called Wireless Display, or WiDi). Of course there’s the touch/tablet revolution as well, with lots of interesting variants. Low-voltage processors can really improve battery life, so your laptop can last longer without being plugged in. And a few companies have come out with portable Android-based stick computers that operate just by being plugged into the HDMI port on a television, for well under $100.