May 29, 2011

Samsung Chromebook Series 5 Netbook by Samsung and Google


Samsung and Google have teamed up to produce the Series 5 “Chromebook”. The 12.1-inch notebook computer weighs 3.3 pounds (about 1.5 kg) and measures only 0.79 inch thin. Samsung calls it a Chromebook because it runs Google’s Chrome operating system. Samsung says the notebook is good for 8.5 hours of battery life, or about five hours of video playback.
Notable features of the Samsung Series 5 Chromebook include a 16:10 aspect ratio screen with 300-nit brightness and an anti-glare coating, a full-size keyboard, an oversized touchpad, a boot time of less than 10 seconds, an HD webcam, two USB ports, and a dual-core Intel processor (model not specified). The battery is rated for up to 1,000 charge cycles, roughly three times the lifetime of conventional batteries. The notebook will also be available with built-in Verizon 3G mobile broadband access.
Availability and pricing: The Samsung Series 5 Chromebook will be available on June 15, 2011 from Amazon.com and BestBuy.com. The Wi-Fi + 3G model will retail for $499.99 and the Wi-Fi-only model for $429.99.

Sony Alpha 33 Review



With the Alpha 33 and Alpha 55, Sony sets out to revolutionize the single-lens reflex concept. These “SLT” cameras no longer have a classic moving mirror, and instead of an optical viewfinder there is an electronic one. This approach promises a smaller and lighter body as well as clear advantages for autofocus. In our detailed review the Alpha 33 has to show whether it can keep Sony’s promises. We also put some of the camera’s equipment highlights under the microscope and take an intensive look at image quality.

Ergonomics and build quality: You might believe the Alpha 33 is a “reflex” camera, so dainty does it come across. In fact it is not a DSLR at all, because there is no longer a moving mirror in the Alpha 33. Instead, Sony has installed a fixed mirror that is largely translucent. Only about 30% of the incident light is redirected onto dedicated phase-detection autofocus sensors. The vast majority of the light passes freely through the “mirror” and falls directly onto the image sensor, which not only captures the picture but also provides the viewfinder image, much as on a compact camera. Sony calls this concept “Single Lens Translucent” (SLT), where “translucent” stands for partially transparent. Hold the Alpha 33 up to your eye and it is immediately clear how this approach differs from a classic single-lens reflex: the Alpha 33 has no optical viewfinder but generates the view electronically. The electronic viewfinder (EVF) resolves more than a million pixels, so finely that even the smallest image details are easy to make out. Focus is also easy to judge with this video viewfinder, helped by the fact that the EVF is nominally one of the largest viewfinders on any APS-C camera: it shows 100 percent of the image area at 1.1x magnification.
What reads impressively on paper quickly becomes a nuisance in practice because of one tiresome detail: the exit pupil sits deep inside the viewfinder housing, so people who wear glasses in particular cannot get close enough to the eyepiece. Despite the relatively generous viewfinder image, you get the impression of looking into a tube with glasses on your nose. You are better off looking into the viewfinder without glasses; a diopter correction of +/-4 dpt makes that possible in most cases. Two other characteristics of the electronic viewfinder matter less in practice: if you look closely you notice a slight graininess, and the viewfinder image smears a little during quick pans but settles down again as soon as the camera comes to rest. You forgive the viewfinder’s small shortcomings at the latest when a photo opportunity arises at night: the EVF boosts its brightness, even while the shutter button is half-pressed. So you always get a very bright viewfinder image in dim light, at the cost of an increasingly noisy display. In one further respect the EVF is clearly superior to an optical viewfinder: it shows far more information. On request it can display a live histogram or an artificial horizon that accurately reports the camera’s orientation.
In addition to the EVF, the Alpha 33 (together with the Alpha 55) is the first Sony DSLR-style camera to offer a display that tilts, swivels and folds. Sony has chosen a somewhat unusual construction: the display is attached to the bottom of the body by a hinge. For portrait-format shots this is a clear advantage, because the small monitor stays closer to the optical axis than one mounted on the side of the body. With the camera mounted on a tripod, however, this design is a disadvantage: the tripod head or quick-release plate gets in the way, the display can only be folded out by 90 degrees at most and, above all, can no longer be turned into the correct position. To make up for it, the screen offers full VGA resolution (a good 920,000 dots) and a generous size of three inches across the diagonal. As with earlier Alphas, the Alpha 33 has an eye sensor below the viewfinder eyepiece that automatically switches from the display to the electronic viewfinder when you raise the camera to your eye.
All in all, the Alpha 33 thus offers great viewing comfort despite (or because of) the EVF in combination with the display. But what about handling? The body is only slightly larger than that of a bridge or superzoom camera. Nevertheless, Sony has managed to provide dedicated buttons and switches for all important functions. The buttons, however, turn out rather small and are not always ergonomically placed. This applies in particular to the important control dial: it sits much too far down on the nicely shaped but slim grip. Where it actually belongs, namely under the index finger, is the rotating main power switch, which increases the risk of accidentally switching the camera off instead of, say, changing the aperture. The very compact body still sits well in the hand, better so for photographers with small hands. Mount a professional lens of the calibre of a Carl Zeiss 24-70/2.8 on the Alpha 33, however, and the whole combination becomes badly top-heavy and can hardly be held securely with one hand. The very compact housing has another downside: it only has room for the NP-FW50 battery developed for the NEX cameras. Its capacity is enough for a maximum of around 340 pictures when using only the rear display; with the EVF the battery is exhausted after about 270 photos. Sony makes the Alpha 33’s housing from plain plastic, which is not necessarily a disadvantage: it keeps the camera a real lightweight, yet it makes a robust impression, and nothing rattles or creaks under a firm grip. It is also nice that especially frequently used functions can be reached quickly via a dedicated function key, so a trip into the well-structured menus is rarely necessary.
Features: You cannot tell just by looking at the Alpha 33 that Sony has given it a very rich and, above all, practical set of features. New for Alpha DSLRs, for example, is the “Auto+” mode: set it on the mode dial and the camera chooses the recording program appropriate to the subject. If you just want to shoot casually, this function works well. In addition there is the “green” full auto mode and the option to set one of nine scene modes manually. Would you rather decide yourself how the camera exposes? No problem, the Alpha 33 offers the usual semi-automatic modes (shutter or aperture priority) and can also be controlled fully manually. This applies not only to the exposure; the ISO sensitivity can be set manually or left to the camera, and the focus can likewise be adjusted automatically or manually.
The special features of the Alpha 33 only become apparent at second glance, for instance after a press of the “D-Range” button: it enables an HDR function with which the camera merges several differently exposed images, shot in very quick succession, into one picture with an extended dynamic range. You can set up to six f-stops of difference for the individual exposures, or alternatively let the HDR automatic determine the dynamic range itself. Another very useful innovation hides in the ISO menu: there you can choose a multi-frame noise reduction for all ISO levels. The Alpha 33 then takes six images in a row and combines them into one photo with astonishingly low noise. This is possible because sensor noise is a stochastic phenomenon: noisy pixels practically never appear in the same place in several successive exposures. This works best with static subjects shot from a tripod.
Sony has not skimped on the “classic” features either: the Alpha 33 has a dedicated control button, a self-timer, allows exposure bracketing (with 0.3 or 0.7 EV steps), knows several exposure metering methods (multi-segment, center-weighted and spot) and offers rich flash features. It is possible, for example, to control system flashes wirelessly from the camera. Second-curtain sync is mastered as well as slow sync, and the flash output can easily be adjusted from the function menu, which makes it possible to mix ambient light and flash perfectly. The camera already has a small flash on board from the factory; it pops up when needed and also assists the autofocus in dark conditions with strong bursts of light. The Alpha 33 offers practically everything you expect from a good mid-range DSLR, including the connectivity common in this class, such as an HDMI output and a socket for a system remote control. In continuous shooting speed it even swings up into the professional class: on request the camera fires off a rapid eight frames per second, something the missing moving mirror makes possible. Only high-end DSLRs costing a large four-figure sum are that fast.
If the Alpha 33 is to keep updating focus in its high-speed burst mode (AF-C), it fixes the aperture at f/3.5 or at the lens’s maximum aperture. The exposure can still be adjusted for every frame, but only by varying the exposure time and/or the ISO value. The aperture mechanism inherited from Minolta is responsible for this restriction: only at up to about six shots per second can it switch the camera quickly enough between the open aperture (for the autofocus) and the working aperture. If you want to set a smaller aperture, for instance to control depth of field deliberately even in high-speed bursts, that only works with single-shot autofocus (AF-S); the Alpha 33 then freezes the focus at the value of the first image in the series. If you want exposure and autofocus tracking without restrictions, you can still shoot at a respectable six frames per second. During burst shooting there is, unlike with a traditional DSLR, no “blackout” in the viewfinder of the Alpha 33; after all, there is no moving mirror to obscure the view. However, in high-speed mode the camera cannot show a continuous live viewfinder image. Instead it displays the picture just taken, so during a burst you see a “slide show” in which each frame appears for 0.15 seconds. Panning shots therefore take some getting used to: a fast subject has already moved noticeably further through the frame than the viewfinder suggests.
The advantages of the SLT concept come fully into their own when it comes to video. Since no moving mirror has to be awkwardly flipped out of the way, the Alpha 33 starts filming with virtually no delay after a press of the “Movie” button, and the fast phase-detection AF focuses more quickly in video mode than any other DSLR. Also great: during filming, one of the 15 AF points can be selected at any time, so the focus can quickly be shifted, for example, from the foreground to the background. Exposure compensation also works during video recording. The aperture, however, is fixed by the Alpha 33 at f/3.5 or the maximum aperture of the lens in use when video autofocus is enabled. Other aperture values can be set, but then you have to focus manually while recording. A stereo microphone in the A33 takes care of a decent movie soundtrack, and the camera also offers the option of connecting an external microphone. The camera records films in full HD (1080i) and stores them either in the space-saving AVCHD format or, friendlier for post-processing, as MP4 files.
As a kit, the Alpha 33 is available together with the “SAL 1855” lens. The zoom lens, with a focal length of 27-82 mm (35 mm equivalent) and made almost entirely of plastic, weighs only about 100 grams. At first this does not exactly inspire confidence, but the kit lens quickly dispels initial concerns. Thanks to its built-in servo motors it convinces with a brisk autofocus that usually finds its target within a third of a second; pre-focused, the camera releases after only about 0.15 seconds. The autofocus does have problems in burst mode, however: it has no difficulty following a subject that moves across the frame, quickly enough from one shot to the next, but if the subject moves directly towards the camera, the Alpha 33 often cannot adjust the focus fast enough and some frames of the series are not properly in focus. Although the Alpha 33 works with phase-detection AF using dedicated AF sensors, it also provides face and smile detection, drawing that information from the imaging sensor. This can lead to the Alpha 33 recognizing a face but not being able to focus on it if it lies outside the 15 AF points.
As is typical for Sony, the lens is not stabilized. For shake protection the Alpha 33 relies on its “SteadyShot” system: the imaging sensor is movable and compensates for the trembling of the photographer’s hand. The image stabilizer permits considerably longer exposure times when shooting hand-held. And because the viewfinder image is generated from the main sensor, the view is stabilized as well, for the first time in a Sony system camera, which is a further advantage of the SLT concept. From a photographic point of view there is thus little to criticize about the SAL 1855. It is annoying that the front lens element rotates when focusing, which makes using a polarizing filter complicated. The kit lens is, however, not particularly well suited to video recording. The zoom ring has a very high torque that first has to be overcome, so smooth zoom pulls are difficult. Furthermore, you cannot change the focal length silently; turn the zoom ring and ugly scraping noises end up on the soundtrack of the video recording.
Image quality: From an ergonomic point of view there is little to criticize about the SLT concept of the Alpha 33. But how does the semi-transparent mirror fixed in the light path affect image quality? We investigated this question intensively in the renowned DC-Tau test lab. As always, the detailed test protocol can be obtained for a small fee (see further links). In absolute terms the sensor of the Alpha 33 receives about 0.6 f-stops less light than the combination of shutter speed and aperture would suggest (referenced to a traditional DSLR). (Note: a 50% light loss corresponds to one full stop; a 30% loss, as in the Alpha 33, corresponds to roughly 0.6 stops.) It was therefore to be feared that this light loss would have a negative effect on noise behaviour and dynamic range. The test protocol of the Alpha 33 quickly dispelled those fears: the noise curve remains at a low level up to ISO 1,600. The camera’s noise reduction goes above all after the dreaded colour noise and low-frequency luminance noise, while very fine brightness errors are deliberately left alone by Sony’s engineers, so less detail is lost; the Alpha 33 does not iron away fine image detail along with the noise. If the ISO value rises above 1,600, detail visibly decreases, yet even photos taken at ISO 6,400 can still be printed in acceptable quality up to 13 x 18 cm. One almost forgets that the Alpha 33 also records RAW on request. Adobe Camera Raw 6.1 can coax a little more detail and less noise out of the RAW files than the JPEGs straight from the camera already deliver.
In terms of dynamic range the Alpha 33 cuts a good figure: up to ISO 3,200 it copes with brightness differences of at least eight f-stops. Like so many other cameras, however, the JPEG files from the Alpha 33 show a raised black level: black is rendered as dark grey rather than true black. The tone curve, with crisp contrast in the midtones but rather soft highlights and shadows, is typical of a “shoot-to-print” camera: ideal if you just want to print the pictures, less suitable for further image processing. The sharpening is also on the crisp side, yet it creates hardly any visible artifacts. Sony has tuned the low-pass filter rather weakly, so very fine structures in the subject increase the risk of moiré. The Alpha 33 offers the JPEG compression levels “Fine” and “Standard”, where “Standard” sacrifices image detail in favour of smaller files.
The inexpensive kit lens, however, cannot fully exploit the potential of the Alpha 33’s sensor. The main problem is a strong drop in resolution towards the edges of the frame, and even stopping down does not help much. If you want to fully exploit the camera’s resolution potential, you will have to reach for finer glass. Older Minolta lenses are also an option: Sony has kept the A lens mount, introduced in 1984, to this day. Vignetting is a less critical point for the SAL 1855; darkening towards the corners is practically a non-issue. The lens’s distortion is also perfectly acceptable, apart from barrel distortion at the wide-angle end. In practice the Alpha 33 convinces with very reliable exposure; unlike earlier Alpha DSLRs, we saw no tendency towards overexposure. The camera renders colours strongly yet with very neutral tones. Perhaps most important of all: the semi-transparent mirror fixed in the light path leaves no trace in the pictures. Even with extremely high-contrast scenes containing point light sources, no ghosting occurred in our test.
Conclusion: Sony has shown real courage with the Alpha 33’s radical departure from the classic single-lens reflex concept. The lack of a moving mirror brings many advantages: the Alpha 33 always works in live view, always offers a fast and accurate phase-detection AF, and provides a stabilized viewfinder image. For video recording the SLT concept makes the Alpha 33 superior to all current DSLRs. It also enables burst rates that were previously reserved for expensive professional DSLRs. The body dimensions are significantly reduced thanks to the omission of the mirror, though at the cost of somewhat fiddly buttons and a small, weak battery. The decision for this concept was especially brave because the Alpha 33 offers no optical viewfinder. But its electronic viewfinder does its job really well: it is bright, large, and shows a detailed view with a wealth of information. The image quality is not compromised by the SLT concept; it is fully up to par.

HP Mini 210 Review


Laptop Mag reviewed the HP Mini 210 and had the following to say:

Pros:
Spacious keyboard, clean design, large touchpad, good HD video playback, strong software bundle

Cons:
Bottom of the unit gets hot, slow boot time, shorter battery life than other netbooks, Broadcom chip doesn't improve Flash playback (yet), below-average performance

Bottom line:
We said that the HP Mini 210 offers one of the best keyboards in its class together with an HD screen at a reasonable price (the Sony VAIO W also offers an HD screen). If you don't need HD capabilities, for $85 less you can get an ASUS Eee PC 1001P, which also has an excellent keyboard, performs slightly better, and lasts about 2 hours longer on battery. In the $380-ish price range, we prefer the Toshiba mini NB305 because of its better ergonomics and cooler temperatures; the Toshiba NB305 happens to be what we named the best netbook of CES. I will add that the HP Mini 210 will come in a ton of different configurations, so be sure to browse our netbook comparison database to sort through the different models and prices.

Fast product specifications:
Intel Atom N450 1.66 GHz processor, 1 GB RAM, 250 GB hard drive, Windows 7 Starter.
Note: I'll update this review with more of my own impressions when HP sends me a Mini 210, but they are notorious for never having enough demo models to send.

7 steps to set your website on the fast track to success

To get on the fast track to success one needs to: 


 Create a website that is dynamic and distinctive. The website name should match the domain name. Bad or broken links must not exist. JavaScript errors must be eliminated. The company’s profile should be clear, concise, and complete. Secure ordering must be in place if required. And, visible links to the company’s business plan, privacy policy, return policy, and guarantee should be present. 


Employ a design with the user in mind. Never use heavy images; keeping each image to 10-12 KB will ensure that pages are not slow. Use graphics that enhance content. Avoid images that change color or blink. Use standard layouts that are reader friendly; the page should breathe and the font size must be comfortable. Use only a few fonts: serif for headlines and sans serif for body text. Limit the number of advertisements, banners, and links on a page. Be sure to test your website using multiple browsers.


Select a directory with vision. Read all the submission requirements and guidelines more than once. Choose a category with thought and planning (browse the directory and look for listings of competitors and related sites). Review your website from an editor's point of view. Ensure that the title and summary are appropriate and relevant to the content of the web site. List the strengths of your site realistically, and be sure to add value to your site.


Increase traffic by submitting the site to web directories. Submit to major ones as well as minor ones; even a few relevant niche sites will boost your popularity and traffic.


 Link your site to others. Search engines give higher positions to sites linked from others. Link the website to major sites as well as minor ones. Contact high traffic sites and request a mention or link. This will boost your engine placement and direct traffic from the pages that are linked.


Optimize the PageRank of your website by choosing inbound and external links with care. Link to relevant sites and not at random; quality is the criterion to consider, not PageRank. List the website in the Open Directory and Yahoo, as this will provide an artificial boost to PageRank. Never place external links on pages that are in turn linked to other sites; external links should be offered on pages with low PageRank that contain many links to pages on your own site. Construct your navigational structure so that important pages are linked from many other internal pages that do not themselves require a high rank. These extra links add rank to the major pages (a small, illustrative sketch of how rank flows along links appears after this list).


Create a site map and link every page to it. Site maps invite the spiders sent out by search engines, which then index every page on the site. Adopt an easy-to-use navigational structure, check for errors regularly, and include a site search tool. Be search-engine friendly and avoid frames, Flash, or code that will trip up a spider or engine. It is not submitting to a directory that ensures success, but taking care of the nitty-gritty.
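To make the PageRank idea from the list above more concrete, here is a minimal, illustrative Python sketch of the power-iteration computation that PageRank is based on. The four-page link graph, the page names and the damping factor of 0.85 are assumptions invented for the example, not data about any real site:

```python
# Minimal, illustrative PageRank power iteration over a hypothetical 4-page site.
# Assumptions: damping factor 0.85; pages and links are invented for the example.

links = {
    "home":     ["products", "about", "sitemap"],
    "products": ["home", "sitemap"],
    "about":    ["home", "sitemap"],
    "sitemap":  ["home", "products", "about"],
}

damping = 0.85
pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}      # start with equal rank everywhere

for _ in range(50):                              # iterate until the ranks settle
    new_rank = {}
    for p in pages:
        # rank flowing into p from every page that links to it
        incoming = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
        new_rank[p] = (1 - damping) / len(pages) + damping * incoming
    rank = new_rank

for page, value in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{page:8s} {value:.3f}")
```

Running it shows that the pages linked from many others (here "home" and "sitemap") end up with the highest rank, which is exactly the effect the step above is describing.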

Hard Drive Making Noises


A hard disk drive (HDD, or also hard drive) is a fixed data storage device that stores data on a magnetic surface layered onto hard disk platters. A hard disk uses rigid rotating platters (disks). Each platter has a planar magnetic surface on which digital data may be stored. Information is written to the disk by transmitting an electromagnetic flux through an antenna or read-write head that is very close to a magnetic material, which in turn changes its polarization due to the flux.
The information can be read by a read-write head which senses electrical change as the magnetic fields pass by in close proximity as the platter rotates.

Disk failure occurs when a hard disk drive no longer operates and the information on it can no longer be accessed by the computer.
This can happen for no apparent reason or due to an external factor such as exposure to fire, water, strong magnetic fields, or a sharp impact. How serious the disk failure is varies.
The most serious and well-known kind is the head crash where the internal read-and-write head of the device touches a platter or a magnetic data storage surface. A head crash normally causes severe data loss and, moreover, data recovery attempts may cause further damage if not done by somebody with specialized knowledge.
There are also controller electronics which occasionally fail. If that happens it may be possible to recover all data.
When your computer's hard disk starts to act funny, make sure that you have an up-to-date backup. Then you can perform some simple diagnostics and possibly repairs. Both Windows and Mac OS come with built-in hard-disk utility software that scans your hard disk for errors and attempts to fix them.
This is what you can do for Windows:
1. Double-click on My Computer to open the My Computer window.
2. Select the disk that you want to diagnose and repair.
3. Choose Properties from the File menu. You can see the Properties window for the drive that you selected.
4. Choose the Tools button.
5. Click the Check Now button under Error Checking Status.
6. Choose either "Thorough" or "Scan for and Attempt Recovery of Bad Sectors."
7. Click on Start.
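If you prefer the command line, Windows also ships with the chkdsk tool, and the short Python sketch below simply wraps that call as an illustration. The drive letter is an example, and the repair flags /f and /r normally require an elevated (administrator) prompt and may schedule the check for the next reboot:

```python
# Sketch: run a disk check on a chosen drive by calling Windows' chkdsk tool.
# Assumption: a Windows machine with chkdsk available; "C:" is just an example drive.
import subprocess

def check_disk(drive="C:", repair=False):
    cmd = ["chkdsk", drive]
    if repair:
        # /f fixes file-system errors, /r also scans for bad sectors;
        # both need an administrator prompt and may schedule a reboot.
        cmd += ["/f", "/r"]
    result = subprocess.run(cmd, capture_output=True, text=True)
    print(result.stdout)
    return result.returncode

if __name__ == "__main__":
    check_disk("C:")   # without flags, chkdsk only reports problems and changes nothing
```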
There are various tools to recover data, such as:

There is a data recovery utility which examines your inaccessible hard drive for damage and corruption and recovers the data.
Such hard drive recovery typically supports Windows 9x / ME / NT / 2000 / XP / 2003, handles the FAT16, FAT32, NTFS and NTFS5 file systems, provides data recovery from formatted logical drives, recovers data from re-partitioned and missing/lost logical drives, and offers lost-folder and deleted-file recovery.
There is also a partition recovery utility available that helps you in recovering all your important data lost after accidental formatting, virus problems, software malfunction, and file or directory deletion. There are also easy-to-use FAT & NTFS partition file recovery utilities that examine your inaccessible hard drives for damage and corruption and recover the data.
There are also systems which:
Provide partition recovery from FAT16, FAT32, NTFS and NTFS5 file systems; recover deleted files and folders even after the Recycle Bin has been emptied or Shift+Del was used; recover files from deleted, lost, missing or formatted logical drives; recover files from missing or lost folders; recognize and preserve long file names when restoring files and folders; and offer multi-disk-drive support, performing NTFS recovery on all IDE, EIDE and SCSI disk devices.
There is also software that recovers corrupt or lost data from floppy disks.
The software supports inaccessible floppy disk data retrieval, FAT, BOOT or ROOT area damaged situations, and rescues various corrupt files from a diskette. BadCopy Pro can recover data from floppy disks that Microsoft Windows identifies as "not formatted", "not accessible", or prompts you to format.
The software does not write data to your floppy disks, but saves the recovered data to a new file that you specify. It can fix corrupted files on floppy disk. It can restore damage to a FAT (File Allocation Table) or BOOT area of the disk. It can repair damage to the ROOT area of the disk, so that files can be listed again. It can erase Viruses on a diskette.
There is also software available that fixes corrupted or damaged Microsoft Word documents.
There is also a new product on the market; software that creates a Windows CD for you, and creates recovery files so you will not need to format to reinstall Windows.
It has been asked many times "I have Windows XP (or ME), how can I install Windows without having to lose all my files. I only have a Recovery Disk". Well it is really very simple, so long as you have a CD burner; or at least a second hard drive.

How To Increase Speed of Pc


There are several reasons that may cause you to want to improve the performance of your PC. The first is that nowadays, new programs are often large consumers of resources, so it is in the order of things that you want to optimize your PC in order to run the software without problems. It is also possible that your PC has been running very well so far, and suddenly you notice a significant decrease in performance. Whatever the case, we will list the effective approaches that allow you to optimize your computer.
How to improve the speed of my PC without investing money?
There are two ways to achieve it. The first is to perform regular maintenance. This is particularly indicated if your PC is fairly new (although this is relative: my mother's PC, used once a week, is after a year in roughly the state mine is in after a month ... it also depends on how often you use it).
Maintenance of computer
To optimize your computer, you should defragment the disk, clean the registry, remove the malware (rogues) that infect 90% of computers, and delete the temporary files and other log files that Windows scatters around. These maintenance operations solve the common problems that cause poor performance.
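As a small, concrete illustration of the "delete temporary files" step, here is a cautious Python sketch that lists (and can optionally delete) files in the user's temp folder that have not been modified for 30 days. The 30-day threshold is an arbitrary assumption, the sketch runs as a dry run by default, and you should have an up-to-date backup before deleting anything:

```python
# Sketch: clean out old files from the user's temp directory.
# Assumption: files untouched for 30+ days are safe to remove; adjust to taste.
import os
import time
import tempfile

def clean_temp(days_old=30, dry_run=True):
    cutoff = time.time() - days_old * 86400
    temp_dir = tempfile.gettempdir()
    count = 0
    for root, _dirs, files in os.walk(temp_dir):
        for name in files:
            path = os.path.join(root, name)
            try:
                if os.path.getmtime(path) < cutoff:
                    print(("would remove: " if dry_run else "removing: ") + path)
                    if not dry_run:
                        os.remove(path)
                    count += 1
            except OSError:
                pass  # file in use or already gone; skip it
    print(f"{count} old file(s) {'found' if dry_run else 'removed'}")

if __name__ == "__main__":
    clean_temp()  # dry run by default; pass dry_run=False to actually delete
```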
How to optimize your computer with a free program?
An optimized PC is a PC that is regularly maintained with the steps mentioned above. In addition, you can install one of the many free programs that improve the speed of the PC; most of them include utilities that cover all of these maintenance elements.
Take a look at the website telecharger.com in the "utilities" section and you'll be spoiled for choice. These programs go even further than these four tips: they will defragment the registry, let you choose which programs start automatically, and more.
A PC with the settings optimized
That's fine, you say, but what do you do when you have already performed every maintenance operation imaginable and you still want to improve the speed of the PC further? It is doable if you're ready to make some concessions on Windows features. The one we can most easily do without is probably the indexing of files.
Disabling indexing permanent files - a great saving of resources
This indexer facilitates searches for files on the hard disk. For my part, I use it only very rarely, yet this indexer runs continuously and therefore impacts system performance: you can disable it if you wish to further optimize the speed of your PC.
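If you do decide to switch the indexer off, one common way on Windows Vista and later is to stop and disable the Windows Search service. The sketch below wraps the standard net and sc commands from Python purely as an illustration; it assumes the service is named WSearch and must be run from an administrator prompt:

```python
# Sketch: stop and disable the Windows Search indexing service.
# Assumptions: service name "WSearch" (Vista and later), run from an admin prompt.
import subprocess

def disable_search_indexing(service="WSearch"):
    # stop the running service, then prevent it from starting at boot
    subprocess.run(["net", "stop", service], check=False)
    subprocess.run(["sc", "config", service, "start=", "disabled"], check=False)

if __name__ == "__main__":
    disable_search_indexing()
```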
Another factor that slows down the system is all the graphic effects on windows, buttons, and so on. If, like me, you prefer functional performance over fancy appearance, disable shadow effects, fades, transparency and the like (especially in Vista). This will free up a lot of resources and allow you to optimize your computer and improve the speed of your PC! Is your PC getting slower day by day? Then get the SpeedUpMyPC 2011 full version legally and increase the speed of your PC. Also learn other PC tricks for your computer maintenance.


Apple Wireless Magic Mouse

Seamless Multi-Touch Surface Magic Mouse — with its low-profile design and seamless top shell — is so sleek and dramatically different, it brings a whole new feel to the way you get around on your Mac. You can't help but marvel at its smooth, buttonless appearance. Then you touch it and instantly appreciate how good it feels in your hand. But it's when you start using Magic Mouse that everything comes together.
The Multi-Touch area covers the top surface of Magic Mouse, and the mouse itself is the button. Scroll in any direction with one finger, swipe through web pages and photos with two, and click and double-click anywhere. Inside Magic Mouse is a chip that tells it exactly what you want to do. Which means Magic Mouse won't confuse a scroll with a swipe. It even knows when you're just resting your hand on it.



Laser-Tracking Engine Magic Mouse uses powerful laser tracking that's far more sensitive and responsive on more surfaces than traditional optical tracking. That means it tracks with precision on nearly every surface — whether it's a table at your favorite cafe or the desk in your home office — without the need for a mousepad.
Introducing Magic Mouse. The world's first Multi-Touch mouse.
The same Multi-Touch technology first introduced on the revolutionary iPhone comes to the mouse. It is called Magic Mouse, and it is the world's first Multi-Touch mouse. Click anywhere, scroll in any direction, and swipe through images on its smooth, seamless top shell. It works wirelessly using Bluetooth, so you do not have to worry about cables or adapters cluttering your workspace. And built-in software lets you configure Magic Mouse any way you want.
The MacConnection is offering Apple's revolutionary Magic Mouse for only $64.99! This mouse relies on the same multi-touch technology as the iPad, which allows you to scroll and much more all with a flick of your finger on the mouse. This is a good Magic Mouse deal if you take advantage of the free shipping with orders of $99 or more, but if you pay the standard $6.25 in shipping, then you are better off paying the $69.00 at Apple Store with free shipping to save a few bucks. Limited supplies available. No coupon code needed for either offer.

Apple Wireless Magic Mouse only $64.99
The Magic Mouse is a multi-touch mouse manufactured and sold by Apple; it was announced and first sold on October 20, 2009. It is the first consumer mouse to have multi-touch capabilities. Taking after the iPhone, iPad, iPod Touch, and multi-touch trackpads, the Magic Mouse allows the use of gestures such as swiping and scrolling across the top surface of the mouse to interact with desktop computers.
The mouse requires at least Mac OS X 10.5.8 and Bluetooth. It can be configured as a two-buttoned left-handed or right-handed mouse, but the default is a single button. It uses laser tracking for increased pointer accuracy over previous-generation Apple mice. Since its release, it has been included along with a wireless keyboard with the 2009 generation of iMacs, and with a wired keyboard with the 2010 Mac Pro workstations. It can also be purchased separately.
Initial reception to the Magic Mouse was mixed, with positive reactions to its scrolling functions but negative reactions to its inability to middle click (without any additional software), or trigger Exposé, Dashboard or Spaces (features offered by its predecessor). Many of those features can be enabled on the Magic Mouse with the use of third party tools.
Underside of the Magic Mouse
The Magic Mouse also has known issues with maintaining a stable connection to Mac Pro workstations.
Connectivity: Wireless
Wireless technology: Bluetooth
Connector type: Bluetooth
Color: White
Contents: Magic Mouse, (2) AA batteries, documentation
Returns policy: This product is subject to the seller's return policy; see the complete return policy for details.
System requirements: A Bluetooth-enabled Macintosh computer running Mac OS X v10.5.8 or later with Wireless Mouse Software Update 1.0
Warranty (labor): Call for warranty
Warranty (parts): Call for warranty


How Is Cloud Computing Influencing The IT Industry?


   As we all know, the traditional way of building an IT environment is to buy servers, hardware, licenses and to install the software. This is a long and costly process, involving many infrastructure demands and long deployment cycles. This fully IT internal model may be commonplace, but IT as we know it today is being replaced by newer technologies.

Lately, cloud computing is causing a major shift in the IT industry. New technologies have been developed, and now there are various ways to virtualize IT systems and to access the needed applications on the Internet, through web based applications. This means no IT costs for hardware or servers.

This utility-based and service-oriented IT model is no longer a plain hardware or software market. Currently, vendors can offer email apps, production systems, security options, storage and backup services, to name just a few of the IT components that can be moved to the cloud.
But how do software developers and IT decision makers adjust themselves to this trend? How is the traditional IT industry affected by these newer available technologies?
* Traditional IT jobs are being changed, as new skills and specialties are increasingly demanded. Before moving to the cloud, the IT staff will need to fully understand the advantages of cloud computing and how it can be integrated into the current business model. Issues such as security and maintenance should be discussed upfront with the cloud computing vendors, and a good IT department will also have to oversee the migration and the ongoing relationship with the cloud provider.
* The IT infrastructure will be crucially changed, as more applications are moved to private or public clouds. Software developers will have to adjust the ways they create and deliver applications.
* The need for IT support staff is reduced, thus diminishing the cost of desktop support. However, a new need is created: training the employees to work with and understand the new systems and applications.
* The effort to maintain the data is also diminished. However, moving the data to the cloud means losing physical control of it, as it is stored in the vendor’s data center. Although clients might not be comfortable with this fact, they should understand that data in the cloud can be safer than data kept in-house. This brings us to our next point.
* Security: Enterprise cloud providers that offer a managed cloud solution have security experts on staff managing the applications, with security options included. A best-practice method is to store the data in multiple facilities to make sure it is safe. I believe this is better than do-it-yourself.
* Highly customizable software: Most of the software that companies use is not “cloud-ready”. This is where software developers come in, by creating code especially designed for the cloud. Cloud providers should also do their best to make this transition easy. However, once applications are delivered as SaaS, the need for IT department troubleshooting decreases.
I have always said that cloud computing is about shifting the interest from physical resources (IT resources and capital expenses) to efficiency and utility. In the end, cloud computing allows companies to focus on doing what they know best, and not on spending a lot of money and time on IT processes. I believe that the companies which fail to adjust to this trend are going to face serious economic and business disadvantages.

Virtual Private Server

Virtual private server (VPS) is a marketing term used by Internet hosting services that refers to a virtual machine dedicated to an individual customer of the service. Even though the software runs on the same physical computer as that of other customers, it works for all practical purposes as a separate computer dedicated to that customer's needs. It offers the privacy of a separate physical machine and can be configured to run just like a server. The term VDS (virtual dedicated server) is used less often for the same thing. Each virtual server can run its own operating system and can be rebooted independently, which gives the user a lot of freedom in configuring and using it. Partitioning a single server so that it appears as multiple servers has been common practice for a long time, especially on mainframe and mid-range computers, and it has become ever more popular with the ongoing development of virtualization software and technologies for microcomputers.
A VPS bridges the gap between dedicated hosting services and shared web hosting services by providing independence from other customers in software terms, while costing less than a physical dedicated server. It runs its own operating system and is capable of running almost any software that runs on that operating system, although some software does not run well in virtualized environments. Two hosting variants are common: unmanaged and unmetered. With unmanaged hosting the customer is left to monitor and administer their own server; with unmetered hosting an unlimited amount of data transfer is allowed on a fixed-bandwidth line. Unmetered hosting is typically offered at 10 Mbit/s, 100 Mbit/s or 1000 Mbit/s.

Here the customer can transfer roughly 3.3 TB per month on 10 Mbit/s, about 33 TB on 100 Mbit/s and about 333 TB on a 1000 Mbit/s line. A VPS is also referred to as a cloud server when it has two attributes: additional hardware resources (CPU, RAM) can be added at runtime, and the server can be moved to other hardware while it is running (in some cases automatically, according to load). As an intermediate service between shared web hosting and dedicated server hosting, the actual hardware server is divided into several isolated environments. Each environment, or slice of the hardware, has its own server software, mail server, and independent software instances and services. VPS hosting offers an extremely affordable hosting solution for business owners.
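Those unmetered transfer figures quoted above follow directly from the line speed. Here is a small Python sketch of the arithmetic, assuming a 31-day month, decimal terabytes and 100% line utilisation; change the assumptions and the totals shift slightly:

```python
# Sketch: how much data a fully used, unmetered line can move in one month.
# Assumptions: 31-day month, decimal units (1 TB = 10**12 bytes), 100% utilisation.
SECONDS_PER_MONTH = 31 * 24 * 3600

def monthly_transfer_tb(mbit_per_s):
    bits = mbit_per_s * 1_000_000 * SECONDS_PER_MONTH
    return bits / 8 / 1e12            # bits -> bytes -> terabytes

for speed in (10, 100, 1000):
    print(f"{speed:>4} Mbit/s -> about {monthly_transfer_tb(speed):.2f} TB per month")
```

The output lands at roughly 3.3, 33 and 335 TB, matching the figures quoted above.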

May 20, 2011

"Kids under 13 should be allowed on Facebook ":Mark Zukerberg


Facebook CEO Mark Zuckerberg spoke at the NewSchools Venture Fund's Summit in Burlingame, Calif. earlier this week.

"Education is clearly the biggest thing that will drive how the economy improves over the long term," Zuckerberg said. "We spend a lot of time talking about this"

"In the future, software and technology will enable people to learn a lot from their fellow students."



Zuckerberg said he wants younger kids to be allowed on social networking sites like Facebook. Currently, the Children's Online Privacy Protection Act (COPPA) mandates that websites that collect information about users (like Facebook does) aren't allowed to sign on anyone under the age of 13. But Zuckerberg is determined to change this.
"That will be a fight we take on at some point," he said. "My philosophy is that for education you need to start at a really, really young age."

"Because of the restrictions we haven't even begun this learning process," Zuckerberg said. "If they're lifted then we'd start to learn what works. We'd take a lot of precautions to make sure that they [younger kids] are safe."

Facebook secretly hired a PR firm to smear Google

The famous social networking site Facebook has been caught secretly hiring a top public relations firm to plant negative stories about the Internet search giant Google Inc.

PR firm Burson-Marsteller was caught up in a scandal for running a secret anti-Google smear campaign on behalf of the social networking site Facebook.
The evidence, which damages relations between the two giants, already bitter rivals, came to light in emails leaked late on Wednesday. Facebook later admitted to the Daily Beast that it had hired Burson-Marsteller.

Burson gave a statement yesterday faulting Facebook. The statement said Facebook had insisted on being kept anonymous, and that Burson should not have gone along with that request.

Paul Cordasco, a spokesman for Burson-Marsteller, told the Guardian yesterday that the assignment was not at all standard operating procedure and was against the company's policies.

Google refused to comment.

May 19, 2011

Symantec says Facebook apps leak personal data


Security company Symantec claims a Facebook application coding error may allow third parties to access users' private details...

Facebook applications may leak users' private data to third parties, including advertisers, according to researchers at security giant Symantec.

The social network site allows third party applications, the most popular of which are games, to run inside an iFrame, a partition within a web page that allows it to run code from an external site.

Symantec claims that due to a coding error, Facebook's iFrame applications leak 'access tokens' to third parties such as advertisers or web analytics providers, granting them permission to access users' photos, messages and personal data.
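To see why a leaked access token matters, here is a simplified, hypothetical illustration in Python: whoever holds a valid token can present it to Facebook's Graph API and read whatever the original application was permitted to read. The token string below is a placeholder; the call shown simply follows the standard Graph API pattern of passing access_token as a query parameter.

```python
# Illustration only: what a third party could do with a leaked access token.
# "LEAKED_TOKEN" is a placeholder; the request works the same way regardless of
# how the token was obtained, which is exactly why leaking it is dangerous.
import json
import urllib.parse
import urllib.request

LEAKED_TOKEN = "AAAB...placeholder..."   # hypothetical token picked up by a third party

def fetch_profile(token):
    query = urllib.parse.urlencode({"access_token": token})
    url = "https://graph.facebook.com/me?" + query
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)           # profile data the app was allowed to read

if __name__ == "__main__":
    try:
        print(fetch_profile(LEAKED_TOKEN))
    except Exception as exc:
        print("request failed (placeholder token):", exc)
```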

"We estimate that as of April 2011, close to 100,000 applications were enabling this leakage," wrote Symantec research Nishant Doshi in a company blog post "We estimate that over the years, hundreds of thousands of applications may have inadvertently leaked millions of access tokens to third parties."

The company believes that those parties may not have realised that they could access that data.

Symantec has informed the social networking giant of the issue, it says. "Facebook notified us of changes on their end to prevent these tokens from getting leaked." It recommends that Facebook users change their passwords.

It is not the first time Facebook has been accused of inadvertently leaking users' private data. In October last year, two Facebook users sued the company, alleging that the 'referrer headers' that tell advertisers when a user has clicked on an ad contain private data about that user's browsing history.

Facebook denied the charges, arguing that there had been no material damage as a result of the practice. A similar suit has since been launched against LinkedIn, the professional social network popular in the IT industry.

Google Chrome OS laptop rentals for $20 a month


Google is set to unveil a Chrome laptop “student package” tomorrow at its I/O developer conference for $20 a month, an unnamed senior Google executive tells Forbes.

If true, the move has the potential to completely reshape the way consumers adopt computers, and it will also serve as a not-so-subtle Trojan horse for Google’s online offerings.

The $20 monthly fee will cover both hardware and online services for the laptops, which run Google’s web-centric Chrome OS software, the executive said. It will likely serve as a precursor to an enterprise Chrome laptop offering, wherein businesses pay a slight premium over their $50 annual fee for Google Apps (the company’s web-based Microsoft Office competitor suite).

The Chrome laptops will likely feature the same mobile broadband capabilities as the CR-48. That computer shipped with built-in 3G access and included 100 megabytes of monthly internet free for two years. You could also opt for daily unlimited internet for $10, 1 gigabyte of mobile internet for $20 a month, 3 GB for $35 a month, and 5 GB for $50 a month.

Aiming the Chrome laptop subscriptions at students seems like a good choice at first glance. After all, paying $20 a month for a computer beats spending $600 or more for a full-fledged laptop. But most students would have a hard time relying solely on the Chrome laptops, since they won’t have access to key Windows and Mac software that some courses may require. Like netbooks, the Chrome laptops could serve as secondary machines — assuming they’re light enough.

Google will also need to offer students something far better than its CR-48 laptop, which was heavy and had one of the worst trackpads ever forced upon a computer.

Reports of a Chrome laptop subscription plan go back a few weeks, when Neowin heard pretty much the same information Forbes did today from a “reliable source.” That report also noted that Google will upgrade the Chrome laptop hardware and offer hardware replacements for the life of the subscription.

Microsoft buys Skype for $8.5 billion


Just days after reports that Google and Facebook were interested in partnering with, and possibly buying, VoIP company Skype, Microsoft announced that it was buying the company for $8.5 billion in cash.
Last year, Skype had revenue of $860 million, on which it posted an operating profit of $264 million. However, it made a small overall loss of $7 million, and had long-term debt of $686 million. This was the second time Skype had been bought out; after being started in 2003, it was purchased by eBay in 2005 for $3.1 billion. eBay then sold the majority of its stake in 2009 to a private investment group for $1.2 billion less than it paid.
The purchase was Microsoft’s biggest ever, surpassing even the $6 billion acquisition of advertising firm aQuantive in 2007. That alone makes it surprising; the company’s track record with large purchases is decidedly mixed. Danger, the exciting mobile technology company that produced the Hiptop, better known as the T-Mobile Sidekick line, was purchased for an estimated $500 million in 2008; the result of that purchase was the disastrous KIN phone and a complete failure to integrate the bought-in talent. The aQuantive purchase too had mixed outcomes, with Redmond unable to find a role for the Razorfish division before eventually selling it off in 2009, and the company’s continued inability to make a profit from online advertising.

Microsoft has in the last couple of years shied away from similar large acquisitions, sticking to buying smaller, easier-to-manage organizations, leading some to argue that this was a direct result of the digestive difficulties faced with the large purchases. An $8.5 billion Skype acquisition would show that perhaps Redmond believes it has resolved such problems.
Microsoft’s own software already has considerable overlap with Skype. Windows Live Messenger offers free instant messaging, and voice and video chat. It currently boasts around 330 million active users each month, typically with around 40 million online at any one moment. Microsoft has an equivalent corporate-oriented system, Lync 2010 (formerly Office Communication Server) that allows companies to create private networks that combine the communications capabilities of Live Messenger with corporate manageability. The underlying technology of both platforms is common, allowing interoperability between Live Messenger and Lync. The company also plans to integrate Kinect into Lync to create more natural virtual presences.
Skype, in contrast, has around a third the number of active users — 124 million each month — as well as fewer simultaneous online connections—typically 20-30 million. Its instant messaging and voice and video call features are broadly similar to those found in Windows Live Messenger, though arguably more refined.
Though the Skype userbase is very much smaller than that of Windows Live Messenger, it does have one key difference: about 8 million Skype users pay for the service. Skype integrates telephone connectivity, able to make both outbound and inbound phone calls, and while its online services are all free to use, these phone services cost money. Skype also has points of presence across the globe, making it easy to buy phone numbers in foreign markets to cheaply establish an international telepresence.
Skype certainly has some things of value. The telephony infrastructure would make a valuable addition to the Messenger/Lync platform. It could also tie-in well with Exchange 2010, which offers voicemail integration. Adding telephony to Lync, Exchange, and Live Messenger is certainly a logical way to extend those products.
