Thursday, December 22, 2011

Seven Ways to Get Yourself Hacked

As targeted scams become more common, it's vital to protect yourself.

In recent months, I've met at least three people who have been the victims of hackers who took over their Gmail accounts and sent out e-mails to everyone in the address book.
The e-mails, which appear legitimate, claim that the person has been robbed while traveling and beg that money be wired so that the person can get home. What makes the scam even more effective is that it tends to happen to people who are actually traveling abroad—making it more likely that friends and family will be duped.
Although it's widely believed that a strong password is one of the best defenses against online fraud, hackers increasingly employ highly effective ways of compromising accounts that do not require guessing passwords at all.


This means that it is more important than ever to practice "defensive computing"—and to have a plan in place for what to do if your account is compromised.

Malware. Sometimes called the "advanced persistent threat," a broad range of software programmed with malicious intent is running on tens of millions of computers throughout the world.
These programs can capture usernames and passwords as you type them, send the data to remote websites, and even open up a "proxy" so that attackers can type commands into a Web browser running on your very computer. This makes today's state-of-the-art security measures—like strong passwords and key fobs—more or less useless, since the bad guys type their commands on your computer after you've authenticated.
Today, the primary defense against malware is antivirus software, but increasingly, the best malware doesn't get caught for days, weeks, or even months after it's been released into the wild. Because antivirus software is failing, many organizations now recommend antediluvian security precautions, such as not clicking on links and not opening files you receive by e-mail unless you know that the mail is legitimate. Unfortunately, there is no tool for assessing legitimacy.

Windows XP. According to the website w3schools, roughly 33 percent of the computers browsing the Internet are running Windows XP. That's a problem, because unlike Windows 7, XP is uniquely susceptible to many of today's most pernicious malware threats. Windows 7, and especially Windows 7 running on 64-bit computers, has security features built in to the operating system such as address space randomization and a non-executable data area. These protections will never be added to Windows XP. Thus, as a general rule, you should not use Windows XP on a computer that's connected to the Internet. Tell that to the 33 percent.

Kiosk computers. You should avoid using public computers at hotels, airports, libraries, and "business centers" to access webmail accounts, because there is simply no way to tell if these computers are infected with malware or not. And many of them are running Windows XP. So avoid them.

Open Wi-Fi. Wireless access points that don't require an encryption key to access don't protect your data as it transits through the air. This means that your username and password can be "sniffed" by anyone else using the access point as well. I haven't been able to find any reports of malware-infected laptops running sniffers at coffee shops, but it's really just a matter of time. The only way to protect yourself is to be sure that the websites and e-mail servers you use employ SSL ("https:") for everything, not just logging in.

Man-in-the-middle attacks. Attackers on those same open Wi-Fi networks can also capture your password using a variety of so-called man-in-the-middle attacks, in which your computer sends information to the wrong website, which, in turn, passes it along to the correct one—so that the communication channel seems fine.
Man-in-the-middle attacks are especially easy over Wi-Fi, but they can take place anywhere on the Internet. Man-in-the-middle attacks can also be implemented through malware. Here even SSL is not enough—you need to be sure that the certificate of the SSL-enabled website is legitimate (a forged certificate will tell your browser that it's connecting to the right site using SSL). Most people also ignore certificate mismatch errors.
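To make that check concrete, here is a minimal Python sketch of certificate verification: it refuses to complete a connection to a server whose certificate doesn't chain to a trusted root or doesn't match the hostname, which is exactly the check a forged certificate fails. The helper name and structure are illustrative, not part of any standard tool.

```python
import socket
import ssl

def has_valid_certificate(hostname, port=443, timeout=5):
    """Return True if the server presents a certificate that chains to a
    trusted root AND matches the hostname. An SSLError here is a sign of
    a forged certificate -- a possible man-in-the-middle."""
    # create_default_context() enables both chain and hostname checking.
    context = ssl.create_default_context()
    try:
        with socket.create_connection((hostname, port), timeout=timeout) as sock:
            with context.wrap_socket(sock, server_hostname=hostname):
                return True
    except (ssl.SSLError, ssl.CertificateError):
        return False
```

Modern browsers perform an equivalent check automatically; the danger is that users click through the resulting mismatch warnings.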

Phishing scams. Surprisingly, a fair number of users still fall for phishing scams, in which they voluntarily hand over their username and password to a malicious website. Typically users end up at these sites when clicking on a link they receive by e-mail.

Different website, same password. Finally, many websites (including major newspapers and magazines) require that you set up an account with an e-mail address and a password in order to access their content. Don't use the same password that you use to access your e-mail—otherwise the website owners (and anyone who hacks that website) will be able to take over your other accounts, including your e-mail.
What happens if you follow all of these precautions and your e-mail account still gets compromised?
Here are some ideas:

Be an authentication pioneer. Google, E*Trade, and other firms have deployed systems that allow you to augment passwords with your cell phone or a handheld security token. Although these systems can be defeated with malware, they are still more secure than passwords alone. Currently you need to opt in to these systems. If you care about your security, you should be a pioneer and give them a try.
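For the curious: the one-time codes these phone and token systems generate are typically based on TOTP (RFC 6238), a short code derived from a shared secret and the current time. Below is a minimal Python sketch of the algorithm — not any vendor's actual implementation — to show why the code changes every 30 seconds.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, timestep=30, digits=6, now=None):
    """Minimal TOTP (RFC 6238): HMAC-SHA1 over the current 30-second
    time counter, dynamically truncated to a short decimal code."""
    key = base64.b32decode(secret_b32)
    counter = int((now if now is not None else time.time()) // timestep)
    msg = struct.pack(">Q", counter)           # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                 # dynamic truncation offset
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)
```

Because the server computes the same code from the same secret and clock, a password thief who lacks your phone or token still can't log in (though, as noted above, malware running on your own machine can ride along after you authenticate).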

Be prepared. Google, Facebook, Apple, Amazon, and others allow you to take proactive security measures to protect your account in the event that the password is compromised. This includes registering alternative e-mail addresses, registering cell phone numbers for backup authentication, and providing answers to "secret questions." Unfortunately, you have to do this before your account gets hacked, not after.

Be alert. Facebook allows you to provide a cell phone number that gets an SMS message whenever someone logs in using a different browser. This is a simple, effective way to monitor when someone other than you accesses your account. If your account is accessed, you'll be in a race to change your password before the attackers do.

Maintain multiple accounts. Don't put all of your eggs in one basket! Have accounts at multiple e-mail providers—and accounts at multiple financial institutions for your money, as well. That way, when you get hacked, at least you'll have a backup.

Keep offline copies. Finally, don't keep the sole copy of your precious data at some cloud provider—download your data to your home computer, then burn it to disc or copy it to a disconnected hard drive. That way, even if you lose your online access, at least you'll have a copy.
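A minimal Python sketch of such an offline backup — copying a folder to a date-stamped directory on, say, an external drive you disconnect afterwards. The paths and helper name are illustrative.

```python
import datetime
import shutil
from pathlib import Path

def backup(source_dir, backup_root):
    """Copy source_dir to a date-stamped folder under backup_root.
    Run this before disconnecting the external drive."""
    stamp = datetime.date.today().isoformat()      # e.g. 2011-12-22
    dest = Path(backup_root) / f"backup-{stamp}"
    shutil.copytree(source_dir, dest)              # dest must not yet exist
    return dest
```

Keeping each day's backup in its own folder also protects you if a corrupted file gets synced over a good one.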

Wednesday, October 5, 2011

World’s Cheapest Tablet: Sakshat, India’s iPad

Sakshat Tablet
 It sounds unbelievable, but this is India's next punch in low-cost computing. First the country launched the world's cheapest laptop, and now, after a series of delays, the much-hyped Indian-made Sakshat Tablet, billed as the world's cheapest, is set to hit the market. According to media reports, the Sakshat Tablet will be delivered to IIT Rajasthan students by the end of June, and the most interesting fact is its price: just Rs 2,200.
The government announced that 10,000 Sakshat tablets will be delivered to IIT Rajasthan in late June. Moreover, around 90,000 Sakshat tablets will be available in the market within three months with an attractive price tag of just Rs 2,200.
The Sakshat tablet, manufactured by HCL Technologies, will feature a 7-inch touchscreen and a Linux operating system. According to these reports, it will be powered by 2GB of RAM and a 32GB hard drive, and other features include Wi-Fi, a USB port, and Bluetooth. Stay tuned for more updates.
 
 
 
Specification:

Hardware:
- Processor: 366 MHz Connexant with Graphics accelerator and HD Video processor
- Memory (RAM): 256MB RAM / Storage (Internal): 2GB Flash
- Storage (External): 2GB to 32GB Supported
- Peripherals (USB2.0 ports, number): 1 Standard USB port
- Audio out: 3.5mm jack / Audio in: 3.5mm jack
- Display and Resolution: 7" display with 800x480 pixel resolution
- Input Devices: Resistive touch screen
- Connectivity and Networking: GPRS and WiFi IEEE 802.11 a/b/g
- Power and Battery: Up to 180 minutes on battery. AC adapter 200-240 volt range.

Software:
- OS: Android 2.2 / Linux
- Document Rendering
* Supported Document formats: DOC, DOCX, PPT, PPTX, XLS, XLSX, ODT, ODP
* PDF viewer, Text editor
- Multimedia and Image Display
* Image viewer supported formats: PNG, JPG, BMP and GIF
* Supported audio formats: MP3, AAC, AC3, WAV, WMA
* Supported video formats: MPEG2, MPEG4, AVI, FLV
- Communication and Internet
* Web browser - Standards Compliance: xHTML 1.1 compliant, JavaScript 1.8 compliant
* Separate application for online YouTube video
- Safety and other standards compliance
* CE certification / RoHS certification.
While the tablets will be priced at Rs 2,200, there are reports of plans for a later 50% subsidy, which could bring the price down to about Rs 1,100. The tablet has been developed as part of the National Mission on Education through Information and Communication Technology, an e-learning initiative to bring together 25,000 colleges and 400 universities in the subcontinent.
I can't wait to get my hands on one, and to see the world's reaction.

Thursday, August 18, 2011

Cloud Computing

Hi Friends,
Have you heard about cloud computing? It's one of the hottest, fastest-growing areas in IT right now, so in this post I'll walk you through the basics.
----------------------------------------------------------------
Cloud computing is the delivery of computing as a service rather than a product, whereby shared resources, software and information are provided to computers and other devices as a utility (like the electricity grid) over a network (typically the Internet).


Overview

Cloud computing provides computation, software, data access, and storage services that do not require end-user knowledge of the physical location and configuration of the system that delivers the services. Parallels to this concept can be drawn with the electricity grid, wherein end-users consume power without needing to understand the component devices or infrastructure required to provide the service.

The concept of cloud computing fills a perpetual need of IT: a way to increase capacity or add capabilities on the fly without investing in new infrastructure, training new personnel, or licensing new software. Cloud computing encompasses any subscription-based or pay-per-use service that, in real time over the Internet, extends IT's existing capabilities.

Cloud computing describes a new supplement, consumption, and delivery model for IT services based on Internet protocols, and it typically involves provisioning of dynamically scalable and often virtualized resources. It is a byproduct and consequence of the ease-of-access to remote computing sites provided by the Internet. This may take the form of web-based tools or applications that users can access and use through a web browser as if they were programs installed locally on their own computers.

Cloud computing providers deliver applications via the Internet, which are accessed from a Web browser, while the business software and data are stored on servers at a remote location. In some cases, legacy applications (line-of-business applications that until now have been prevalent in thin-client Windows computing) are delivered via a screen-sharing technology, while the computing resources are consolidated at a remote data center location; in other cases, entire business applications have been coded using web-based technologies such as AJAX.

Most cloud computing infrastructures consist of services delivered through shared data-centers and appearing as a single point of access for consumers' computing needs. Commercial offerings may be required to meet service level agreements (SLAs), but specific terms are less often negotiated by smaller companies.

  
Comparisons
Cloud computing shares characteristics with:
  • Autonomic computing — Computer systems capable of self-management.
  • Client–server model — Client–server computing refers broadly to any distributed application that distinguishes between service providers (servers) and service requesters (clients).
  • Grid computing — "A form of distributed and parallel computing, whereby a 'super and virtual computer' is composed of a cluster of networked, loosely coupled computers acting in concert to perform very large tasks."
  • Mainframe computer — Powerful computers used mainly by large organizations for critical applications, typically bulk data processing such as census, industry and consumer statistics, enterprise resource planning, and financial transaction processing.
  • Utility computing — The "packaging of computing resources, such as computation and storage, as a metered service similar to a traditional public utility, such as electricity."
  • Peer-to-peer — Distributed architecture without the need for central coordination, with participants being at the same time both suppliers and consumers of resources (in contrast to the traditional client–server model).
  • Service-oriented computing – Cloud computing provides services related to computing while, in a reciprocal manner, service-oriented computing consists of the computing techniques that operate on software-as-a-service.
Characteristics
Cloud computing exhibits the following key characteristics:
  • Agility improves with users' ability to re-provision technological infrastructure resources.
  • Application programming interface (API) accessibility to software that enables machines to interact with cloud software in the same way the user interface facilitates interaction between humans and computers. Cloud computing systems typically use REST-based APIs.
  • Cost is claimed to be reduced and in a public cloud delivery model capital expenditure is converted to operational expenditure. This is purported to lower barriers to entry, as infrastructure is typically provided by a third-party and does not need to be purchased for one-time or infrequent intensive computing tasks. Pricing on a utility computing basis is fine-grained with usage-based options and fewer IT skills are required for implementation (in-house).
  • Device and location independence enable users to access systems using a web browser regardless of their location or what device they are using (e.g., PC, mobile phone). As infrastructure is off-site (typically provided by a third-party) and accessed via the Internet, users can connect from anywhere.
  • Multi-tenancy enables sharing of resources and costs across a large pool of users thus allowing for:
    • Centralization of infrastructure in locations with lower costs (such as real estate, electricity, etc.)
    • Peak-load capacity increases (users need not engineer for highest possible load-levels)
    • Utilization and efficiency improvements for systems that are often only 10–20% utilized.
  • Reliability is improved if multiple redundant sites are used, which makes well-designed cloud computing suitable for business continuity and disaster recovery.
  • Scalability and Elasticity via dynamic ("on-demand") provisioning of resources on a fine-grained, self-service basis near real-time, without users having to engineer for peak loads.
  • Performance is monitored, and consistent and loosely coupled architectures are constructed using web services as the system interface.
  • Security could improve due to centralization of data, increased security-focused resources, etc., but concerns can persist about loss of control over certain sensitive data, and the lack of security for stored kernels. Security is often as good as or better than under traditional systems, in part because providers are able to devote resources to solving security issues that many customers cannot afford. However, the complexity of security is greatly increased when data is distributed over a wider area or greater number of devices and in multi-tenant systems that are being shared by unrelated users. In addition, user access to security audit logs may be difficult or impossible. Private cloud installations are in part motivated by users' desire to retain control over the infrastructure and avoid losing control of information security.
  • Maintenance of cloud computing applications is easier, because they do not need to be installed on each user's computer. They are easier to support and to improve, as the changes reach the clients instantly.
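As a concrete illustration of the API characteristic above, here is a Python sketch that builds (but does not send) the kind of REST request a cloud client might issue to list its virtual machines. The endpoint URL and token are hypothetical, not a real service's API.

```python
import urllib.request

# Hypothetical endpoint and credential -- purely illustrative.
API_URL = "https://api.example-cloud.com/v1/instances"
API_TOKEN = "dummy-token"

def build_list_instances_request():
    """Build a REST-style GET request of the sort a cloud API client
    would send: a plain HTTP call with a bearer token, no special
    client software required."""
    return urllib.request.Request(
        API_URL,
        headers={
            "Authorization": f"Bearer {API_TOKEN}",
            "Accept": "application/json",
        },
        method="GET",
    )
```

The point is that the whole interaction is ordinary HTTP plus JSON, which is why any machine — not just a human at a browser — can drive cloud resources programmatically.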
Architecture


Cloud computing sample architecture
Cloud architecture, the systems architecture of the software systems involved in the delivery of cloud computing, typically involves multiple cloud components communicating with each other over a loose coupling mechanism, such as a messaging queue.

History

The term "cloud" is used as a metaphor for the Internet, based on the cloud drawing used in the past to represent the telephone network, and later to depict the Internet in computer network diagrams as an abstraction of the underlying infrastructure it represents.
Cloud computing is a natural evolution of the widespread adoption of virtualization, service-oriented architecture, autonomic, and utility computing. Details are abstracted from end-users, who no longer have need for expertise in, or control over, the technology infrastructure "in the cloud" that supports them.
The underlying concept of cloud computing dates back to the 1960s, when John McCarthy opined that "computation may someday be organized as a public utility." Almost all the modern-day characteristics of cloud computing (elastic provision, provided as a utility, online, illusion of infinite supply), the comparison to the electricity industry and the use of public, private, government, and community forms, were thoroughly explored in Douglas Parkhill's 1966 book, The Challenge of the Computer Utility.
The actual term "cloud" borrows from telephony in that telecommunications companies, who until the 1990s offered primarily dedicated point-to-point data circuits, began offering Virtual Private Network (VPN) services with comparable quality of service but at a much lower cost. By switching traffic to balance utilization as they saw fit, they were able to utilize their overall network bandwidth more effectively. The cloud symbol was used to denote the demarcation point between that which was the responsibility of the provider and that which was the responsibility of the user. Cloud computing extends this boundary to cover servers as well as the network infrastructure.
After the dot-com bubble, Amazon played a key role in the development of cloud computing by modernizing their data centers, which, like most computer networks, were using as little as 10% of their capacity at any one time, just to leave room for occasional spikes. Having found that the new cloud architecture resulted in significant internal efficiency improvements whereby small, fast-moving "two-pizza teams" could add new features faster and more easily, Amazon initiated a new product development effort to provide cloud computing to external customers, and launched Amazon Web Services (AWS) on a utility computing basis in 2006.
In early 2008, Eucalyptus became the first open-source, AWS API-compatible platform for deploying private clouds. In early 2008, OpenNebula, enhanced in the RESERVOIR European Commission-funded project, became the first open-source software for deploying private and hybrid clouds, and for the federation of clouds. In the same year, efforts were focused on providing QoS guarantees (as required by real-time interactive applications) to cloud-based infrastructures, in the framework of the IRMOS European Commission-funded project. By mid-2008, Gartner saw an opportunity for cloud computing "to shape the relationship among consumers of IT services, those who use IT services and those who sell them" and observed that "[o]rganisations are switching from company-owned hardware and software assets to per-use service-based models" so that the "projected shift to cloud computing ... will result in dramatic growth in IT products in some areas and significant reductions in other areas."
