Tuesday, January 29, 2008

SOCIAL BOOKMARKING

Social bookmarking is a method for Internet users to store, organize, search, and manage bookmarks of web pages on the Internet with the help of metadata.

In a social bookmarking system, users save links to web pages that they want to remember and/or share. These bookmarks are usually public, but they can also be saved privately, shared only with specified people or groups, shared only inside certain networks, or some other combination of public and private. People with access can usually browse these bookmarks chronologically, by category or tag, or via a search engine.

Most social bookmark services encourage users to organize their bookmarks with informal tags instead of the traditional browser-based system of folders, although some services feature categories/folders or a combination of folders and tags. They also enable viewing bookmarks associated with a chosen tag, and include information about the number of users who have bookmarked them. Some social bookmarking services also draw inferences from the relationship of tags to create clusters of tags or bookmarks.
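
The tag-based organization described above amounts to a small inverted index: each tag maps to the set of bookmarks carrying it. The following minimal Python model is a sketch for illustration only; all class names and URLs are hypothetical:

```python
from collections import defaultdict

class TagIndex:
    """Minimal sketch of tag-based bookmark organization (illustrative)."""
    def __init__(self):
        self.by_tag = defaultdict(set)   # tag -> set of bookmarked URLs

    def save(self, url, tags):
        # saving a bookmark files it under every tag the user supplied
        for tag in tags:
            self.by_tag[tag].add(url)

    def urls_for(self, tag):
        # view all bookmarks associated with a chosen tag
        return sorted(self.by_tag[tag])

    def counts(self):
        # how many bookmarks carry each tag, most-used tags first
        return sorted(((t, len(u)) for t, u in self.by_tag.items()),
                      key=lambda x: -x[1])

index = TagIndex()
index.save("http://example.com/recipes", ["food", "recipes"])
index.save("http://example.com/cheese",  ["food"])
print(index.urls_for("food"))
```

Real services layer accounts, privacy settings, and feeds on top of essentially this structure.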

Many social bookmarking services provide web feeds for their lists of bookmarks, including lists organized by tags. This allows subscribers to become aware of new bookmarks as they are saved, shared, and tagged by other users.

As these services have matured and grown more popular, they have added extra features such as ratings and comments on bookmarks, the ability to import and export bookmarks from browsers, emailing of bookmarks, web annotation, and groups or other social network features.

The concept of shared online bookmarks dates back to April 1996 with the launch of itList.com. Within the next three years, online bookmark services became competitive, with venture-backed companies like Backflip, Blink, Clip2, Hotlinks, Quiver, and others entering the market. Lacking viable models for making money, this early generation of social bookmarking companies failed as the dot-com bubble burst.

Founded in late 2003, del.icio.us pioneered tagging and coined the term "social bookmarking". In 2004, as del.icio.us began to take off, Citeulike, Connotea (focusing on social bookmarking for scientists), Simpy, Furl, and Stumbleupon were released, followed by Netvouz in 2005. In 2006, Ma.gnolia and Diigo also entered the bookmarking field. Sites such as Digg, reddit, and Newsvine offer a related type of web service that provides a system for social news. In 2006, Connectbeam became the first company to launch a social bookmarking application aimed squarely at businesses and enterprises, and it continues to innovate in this direction. In 2007, IBM announced plans to enter the social software market, and the BBC web site added social bookmarking links to its news and sport articles, as many other news websites had done earlier.

This system has several advantages over traditional automated resource location and classification software, such as search engine spiders. All tag-based classification of Internet resources (such as web sites) is done by human beings, who understand the content of the resource, as opposed to software, which algorithmically attempts to determine the meaning of a resource. This provides for semantically classified tags, which are hard to find with contemporary search engines.

Additionally, as people bookmark resources that they find useful, resources of greater use are bookmarked by more users. Thus, such a system will "rank" a resource based on its perceived utility. This is arguably a more useful metric for end users than systems which rank resources based on the number of external links pointing to them.
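
This utility-based ranking boils down to counting distinct users per URL and sorting. A toy Python illustration (all usernames and URLs hypothetical):

```python
from collections import defaultdict

# toy data: which users bookmarked which URL
bookmarks = [
    ("alice", "http://example.com/tutorial"),
    ("bob",   "http://example.com/tutorial"),
    ("carol", "http://example.com/tutorial"),
    ("bob",   "http://example.com/news"),
]

# invert to URL -> set of distinct savers
savers = defaultdict(set)
for user, url in bookmarks:
    savers[url].add(user)

# rank URLs by how many distinct users saved them (perceived utility)
ranked = sorted(savers, key=lambda u: len(savers[u]), reverse=True)
print(ranked[0])  # http://example.com/tutorial
```

Using distinct users rather than raw saves makes the metric harder (though not impossible) to game by repeated submission.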

There are drawbacks to such tag-based systems as well: there is no standard set of keywords (a controlled vocabulary), no standard for the structure of tags (e.g. singular vs. plural, capitalization), mistagging due to spelling errors, tags that can have more than one meaning, unclear tags due to synonym/antonym confusion, highly unorthodox and "personalized" tag schemas from some users, and no mechanism for users to indicate hierarchical relationships between tags (e.g. a site might be labeled as both cheese and cheddar, with no way to indicate that cheddar is a refinement or sub-class of cheese). Services which allow both tags and folders for organizing bookmarks (such as Netvouz) make this less of a problem, though.
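
Services sometimes attack the vocabulary problem with naive tag normalization, which smooths surface variation (case, plurals) but cannot recover semantic hierarchy. A sketch of such a rule, for illustration only (not any particular service's algorithm):

```python
def normalize_tag(tag):
    """Naive normalization: lowercase and strip a trailing plural 's'.
    This collapses surface variants like 'Cheeses' and 'cheese', but it
    cannot learn that 'cheddar' is a sub-class of 'cheese'."""
    tag = tag.strip().lower()
    if tag.endswith("s") and len(tag) > 3:
        tag = tag[:-1]
    return tag

print(normalize_tag("Cheeses"))  # cheese
print(normalize_tag("cheese"))   # cheese
print(normalize_tag("cheddar"))  # cheddar  (no link to 'cheese')
```

Even this crude rule misfires on words like "news", which is one reason tag vocabularies stay messy in practice.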

Social bookmarking can also be susceptible to corruption and collusion. Due to its popularity, some users have started treating it as a tool to use alongside search engine optimization to make their websites more visible. The more often a web page is submitted and tagged, the better chance it has of being found. Spammers have started bookmarking the same web page multiple times and/or tagging each page of their web site with many popular tags, obliging developers to constantly adjust their security systems to overcome abuse. Because of this, some social bookmarking websites were forced to add CAPTCHA protection against spam, which caused problems for people who use social bookmarking for legitimate purposes.

Saturday, January 26, 2008

WINDOWS VISTA


Windows Vista (pronounced /ˈvɪstə/) is a line of operating systems developed by Microsoft for use on personal computers, including home and business desktops, laptops, Tablet PCs, and media centers. Prior to its announcement on July 22, 2005, Windows Vista was known by its codename "Longhorn". Development was completed on November 8, 2006; over the following three months it was released in stages to computer hardware and software manufacturers, business customers, and retail channels. On January 30, 2007, it was released worldwide to the general public, and was made available for purchase and downloading from Microsoft's web site. The release of Windows Vista comes more than five years after the introduction of its predecessor, Windows XP, making it the longest time span between two releases of Microsoft Windows.

Windows Vista contains hundreds of new and reworked features; some of the most significant include an updated graphical user interface and visual style dubbed Windows Aero, improved searching features, new multimedia creation tools such as Windows DVD Maker, and completely redesigned networking, audio, print, and display sub-systems. Vista also aims to increase the level of communication between machines on a home network using peer-to-peer technology in an effort to simplify sharing files and digital media between computers and devices. Windows Vista includes version 3.0 of the .NET Framework, which aims to make it significantly easier for developers to write applications than with the traditional Windows API.

Microsoft's primary stated objective with Windows Vista, however, has been to improve the state of security in the Windows operating system. One common criticism of Windows XP and its predecessors has been their commonly exploited security vulnerabilities and overall susceptibility to malware, viruses and buffer overflows. In light of this, Microsoft chairman Bill Gates announced in early 2002 a company-wide "Trustworthy Computing initiative" which aims to incorporate security work into every aspect of software development at the company. Microsoft stated that it prioritized improving the security of Windows XP and Windows Server 2003 above finishing Windows Vista, thus delaying its completion.

Windows Vista has received a number of negative assessments. PC World listed it #1 of "the 15 biggest tech disappointments of 2007," saying that "many users are clinging to XP like shipwrecked sailors to a life raft, while others who made the upgrade are switching back." Criticism targets include protracted development time, more restrictive licensing terms, the inclusion of a number of new digital rights management technologies aimed at restricting the copying of protected digital media, lack of device drivers for some hardware, and the usability of other new features such as User Account Control.

Microsoft began work on Windows Vista, known at the time by its codename "Longhorn", in May 2001, five months prior to the release of Windows XP. It was originally expected to ship sometime late in 2003 as a minor step between Windows XP and "Blackcomb", which was planned to be the company's next major operating system release. Gradually, "Longhorn" assimilated many of the important new features and technologies slated for "Blackcomb", resulting in the release date being pushed back several times. Many of Microsoft's developers were also re-tasked with improving the security of Windows XP and Windows Server 2003, both of which had been the target of a number of high-profile security lapses. Faced with ongoing delays and concerns about feature creep, Microsoft announced on August 27, 2004 that it had revised its plans. The original "Longhorn", based on the Windows XP source code, was scrapped, and Longhorn's development started anew, building on the Windows Server 2003 Service Pack 1 codebase, and re-incorporating only the features intended for an actual operating system release. Some previously announced features such as WinFS were dropped or postponed, and a new software development methodology called the "Security Development Lifecycle" was incorporated in an effort to address concerns with the security of the Windows codebase.

After "Longhorn" was named Windows Vista in July 2005, an unprecedented beta-test program was started, involving hundreds of thousands of volunteers and companies. In September of that year, Microsoft started releasing regular Community Technology Previews (CTP) to beta testers. The first of these was distributed at the 2005 Microsoft Professional Developers Conference, and was subsequently released to beta testers and Microsoft Developer Network subscribers. The builds that followed incorporated most of the planned features for the final product, as well as a number of changes to the user interface, based largely on feedback from beta testers. Windows Vista was deemed feature-complete with the release of the "February CTP", released on February 22, 2006, and much of the remainder of work between that build and the final release of the product focused on stability, performance, application and driver compatibility, and documentation. Beta 2, released in late May, was the first build to be made available to the general public through Microsoft's Customer Preview Program. It was downloaded by over five million people. Two release candidates followed in September and October, both of which were made available to a large number of users.

While Microsoft had originally hoped to have the consumer versions of the operating system available worldwide in time for Christmas 2006, it was announced in March 2006 that the release date would be pushed back to January 2007, in order to give the company – and the hardware and software companies which Microsoft depends on for providing device drivers – additional time to prepare. Through much of 2006, analysts and bloggers had speculated that Windows Vista would be delayed further, owing to anti-trust concerns raised by the European Commission and South Korea, and due to a perceived lack of progress with the beta releases. However, with the November 8, 2006 announcement of the completion of Windows Vista, Microsoft's lengthiest operating system development project came to an end.

INTERNATIONAL BUSINESS MACHINES (IBM)


International Business Machines Corporation (abbreviated IBM, nicknamed "Big Blue"; NYSE: IBM) is a multinational computer technology and consulting corporation headquartered in Armonk, New York, USA. The company is one of the few information technology companies with a continuous history dating back to the 19th century. IBM manufactures and sells computer hardware and software, and offers infrastructure services, hosting services, and consulting services in areas ranging from mainframe computers to nanotechnology.

IBM has been known through most of its recent history as the world's largest computer company; with over 355,000 employees worldwide, IBM is the largest information technology employer in the world. It is also the most profitable, but in revenue it fell to second place behind Hewlett-Packard in 2007. Since 1990, IBM's annual sales growth has trailed US economic growth due to global deregulation and competition.

IBM holds more patents than any other U.S.-based technology company. It has engineers and consultants in over 170 countries, and IBM Research has eight laboratories worldwide. IBM employees have earned three Nobel Prizes, four Turing Awards, five National Medals of Technology, and five National Medals of Science. As a chip maker, IBM is among the Worldwide Top 20 Semiconductor Sales Leaders.

The company which became IBM was founded in 1888 as the Tabulating Machine Company by Herman Hollerith, in Broome County, New York. It was incorporated as Computing Tabulating Recording Corporation (CTR) on June 16, 1911, and was listed on the New York Stock Exchange in 1916. IBM adopted its current name in 1924, when it became a Fortune 500 company.

During the Second World War, IBM played a role in the automation of Nazi Germany's concentration camps, as described in Edwin Black's book IBM and the Holocaust.

In 1981 IBM introduced the IBM Personal Computer, the original version and progenitor of the IBM PC compatible hardware platform.

IBM's PC division was bought by the Chinese company Lenovo on May 1, 2005 for $655 million in cash and $600 million in Lenovo stock. On January 25, 2007, Ricoh announced the purchase of IBM's Printing Systems Division for $725 million and an investment in a three-year joint venture to form a new Ricoh subsidiary, InfoPrint Solutions Company; Ricoh will own a 51% share and IBM a 49% share in InfoPrint.

In 2003, IBM embarked on an ambitious project to rewrite its company values. Using its Jam technology, the company hosted intranet-based online discussions on key business issues with 50,000 employees over three days. The discussions were analyzed by sophisticated text analysis software (eClassifier) to mine online comments for themes. As a result of the 2003 Jam, the company values were updated to reflect three modern business, marketplace, and employee views: "Dedication to every client's success", "Innovation that matters - for our company and for the world", and "Trust and personal responsibility in all relationships".

In 2004, another Jam was conducted during which 52,000 employees exchanged best practices for 72 hours. They focused on finding actionable ideas to support implementation of the values previously identified. A new post-Jam Ratings event was developed to allow IBMers to select key ideas that support the values. The board of directors cited this Jam when awarding Palmisano a pay rise in the spring of 2005.

In July and September 2006, Palmisano launched another jam, called InnovationJam. It was the largest online brainstorming session ever, with more than 150,000 participants from 104 countries. The participants were IBM employees, members of their families, universities, partners, and customers. InnovationJam was divided into two 72-hour sessions (one in July and one in September) and generated more than 46,000 ideas. In November 2006, IBM announced that it would invest US$100 million in the 10 best ideas from InnovationJam.

Thursday, January 24, 2008

INCREASING ALEXA RANK FOR DUMMIES

Many of you have been asking me about page rank and Alexa ranking. I hope this post answers all your questions about Alexa and how to improve your rank on the site. This post is dedicated to all bloggers. I have written these steps to increase Alexa rankings for dummies: just follow them and you will never go wrong. Have fun.

1. Install the Alexa toolbar or Firefox’s SearchStatus extension and set your blog as your homepage. This is the most basic step.

2. Put up an Alexa rank widget on your website. I did this a few days ago and it receives a fair number of clicks every day. According to some, each click counts as a visit even if the visitor does not have the toolbar installed.

3. Encourage others to use the Alexa toolbar. This includes friends, fellow webmasters as well as site visitors/blog readers. Be sure to link to Alexa’s full explanation of their toolbar and tracking system so your readers know what installing the toolbar or extension entails.

4. Work in an office or own a company? Get the Alexa toolbar or the SearchStatus Firefox extension installed on all computers and set your website as the homepage for all browsers. Note that this may only work when dynamic or different IPs are used.

5. Get friends to review and rate your Alexa website profile. I'm not entirely sure of its impact on rankings, but it might help in some way.

6. Write or blog about Alexa. Webmasters and bloggers love to hear about ways to increase their Alexa rank. They'll link to you and send you targeted traffic (i.e. visitors with the toolbar already installed). This gradually improves your Alexa ranking.

7. Flaunt your URL in webmaster forums. Webmasters usually have the toolbar installed. You’ll get webmasters to visit your website and offer useful feedback. It’s also a good way to give back to the community if you have useful articles to share with others.

8. Write content that is related to webmasters. This can fall in the category of domaining and SEO, two fields in which most webmasters will have the Alexa toolbar installed. Promote your content on social networking websites and webmaster forums.

9. Use Alexa redirects on your website URL. Try this: http://redirect.alexa.com/redirect?www.itsallaboutbrian.blogspot.com . Replace www.itsallaboutbrian.blogspot.com with the URL for your website. Leave this redirected URL in blog comments as well as forum signatures. This redirect will count a unique IP address once a day, so clicking it multiple times won't help. There is no official proof that redirects benefit your Alexa Rank, so use them with caution.

10. Post in Asian social networking websites or forums. Some webmasters have suggested that East Asian web users are big Alexa toolbar fans, judging by the presence of several Asia-based websites in the Alexa Top 500. I suggest trying this only if you have the time or capacity to do so.

11. Create a webmaster tools section on your website. This is a magnet for webmasters who will often revisit your website to gain access to the tools. Aaron Wall’s webpage on SEOTools is a very good example.

12. Get Dugg or Stumbled. This usually brings massive numbers of visitors to your website and the sheer amount will have a positive impact on your Alexa Rank. Naturally, you’ll need to develop link worthy material.

13. Use pay-per-click campaigns. Buying advertisements on search engines such as Google or Exact Seek will help bring in traffic. This is doubly useful when your ad is highly relevant to webmasters.

14. Create an Alexa category on your blog and use it to include any articles or news about Alexa. This acts as an easily accessible resource for webmasters or casual search visitors while helping you rank in the search engines.

15. Optimize your popular posts. Got a popular post that consistently receives traffic from the search engines? Include a widget/graph at the bottom of the post, link to your Alexa post or use Alexa redirection on your internal URLs.

16. Buy banners and links for traffic from webmaster forums and websites. A prominent and well displayed ad will drive lots of webmaster traffic to your website, which can significantly boost your rank.

17. Hire forum posters to pimp your website. Either buy signatures in webmaster forums or promote specific articles or material in your website on a regular basis. You can easily find posters for hire in Digital Point and other webmaster forums.

18. Pay Cybercafe owners to install the Alexa toolbar and set your website as the homepage for all their computers. This might be difficult to arrange and isn’t really a viable solution for most. I’m keeping this one in because some have suggested that it does work.

19. Use MySpace. This is a little shady, so I don't recommend it unless you're really interested in artificially inflating your Alexa Rank. Use visually attractive pictures or banners and link them to your redirected Alexa URL. This will be most effective if your website has content that is actually relevant to the MySpace crowd.

20. Try Alexa auto-surfs. Do they work? Maybe for brand-new sites. I think they are mostly suitable for new websites with a very poor Alexa rank. Note that there may be problems if you try to use auto-surfs alongside contextual ads like Adsense. They also aren't a long-term solution to improving your Alexa Rank, so I suggest using them with caution.

Wednesday, January 23, 2008

ALEXA RANKINGS EXPLAINED


Alexa Internet, Inc. is a California-based subsidiary company of Amazon.com that is best known for operating a website that provides information on the web traffic to other websites.

Alexa Internet was founded in 1996 by Brewster Kahle and Bruce Gilliat. The company offered a toolbar that gave Internet users guidance on where to go next, based on the traffic patterns of its user community. Alexa also offered context for each site visited: to whom it was registered, how many pages it had, how many other sites pointed to it, and how frequently it was updated. Engineers at Alexa, in cooperation with the Internet Archive, created the Internet Archive's Wayback Machine. Alexa also supplies the Internet Archive with web crawls.

In 1999, Alexa was acquired by Amazon.com for about $250 million in Amazon stock.

The company's premises are in Building 37 of the Presidio of San Francisco.

Alexa began a partnership with Google in spring 2002, and with the Open Directory Project in January 2003. Live Search replaced Google as a provider of search results in May 2006. In September 2006, they began using their own Search Platform to serve results. In December 2006, they released Alexa Image Search. Built in-house, it is the first major application to be built on their Web Platform. Today, Alexa is primarily a search engine, an Open Directory-based web directory, and a supplier of site information.

Alexa also provides "site info" for the A9.com search engine.

In December 2005, Alexa opened its extensive search index and web-crawling facilities to third-party programs through a comprehensive set of web services and APIs. These could be used, for instance, to construct vertical search engines that could run on Alexa's own servers or elsewhere. Uniquely, their Web Search Platform gives developers access to their raw crawl data. In May 2007, Alexa changed their API to limit comparisons to three sites, to serve reduced-size embedded graphs using Flash, and to require embedded BritePic ads.

In April 2007, Alexa v. Hornbaker was filed to stop trademark infringement by the statsaholic service. In the lawsuit, Alexa alleges that Hornbaker is stealing traffic graphs for profit, and that the primary purpose of his site is to display graphs that are generated by Alexa's servers. Hornbaker removed the term Alexa from his service name on March 19, 2007.

Alexa rank is based on website traffic: the lower a site's Alexa rank, the more people are visiting it.

Monday, January 21, 2008

ATHLON PROCESSORS - AMD


ATHLON is the brand name applied to a series of different x86 processors designed and manufactured by AMD. The original Athlon, or Athlon Classic, was the first seventh-generation x86 processor and, in a first, retained the initial performance lead it had over Intel's competing processors for a significant period of time. AMD has continued the Athlon name with the Athlon 64, an eighth-generation processor featuring AMD64 (later renamed x86-64) technology.

The Athlon made its debut on June 23, 1999. The name was chosen by AMD as a shortened form of "decathlon"; athlon was the ancient Greek word for "champion" or "trophy of the games".

AMD ex-CEO and founder Jerry Sanders developed strategic partnerships during the late 1990s to improve AMD's presence in the PC market based on the success of the K6 architecture. One major partnership announced in 1998 paired AMD with semiconductor giant Motorola. In the announcement, Sanders referred to the partnership as creating a "virtual gorilla" that would enable AMD to compete with Intel on fabrication capacity while limiting AMD's financial outlay for new facilities. This partnership also helped to co-develop copper-based semiconductor technology, which would become a cornerstone of the K7 production process.

In August 1999, AMD released the Athlon (K7) processor. Notably, the design team was led by Dirk Meyer, one of the lead engineers on the DEC Alpha project. Jerry Sanders had approached many of the engineering staff to work for AMD as DEC wound the project down, and brought in a near-complete team of engineering experts. The balance of the Athlon design team was composed of AMD K5 and K6 veterans.

By working with Motorola, AMD was able to refine copper interconnect manufacturing to the production stage about one year before Intel. The revised process permitted 180-nanometer processor production. The accompanying die-shrink resulted in lower power consumption, permitting AMD to increase Athlon clock-speeds to the 1 gigahertz range. AMD found processor yields on the new process exceeded expectations, and delivered high speed chips in volume in March 2000.

Internally, the Athlon is a fully seventh-generation x86 processor, the first of its kind. Like the AMD K5 and K6, the Athlon is a RISC microprocessor which decodes x86 instructions into its own internal instructions at runtime. The CPU is an out-of-order design, again like previous post-5x86 AMD CPUs. The Athlon utilizes the DEC Alpha EV6 bus architecture with double data rate (DDR) technology. This means that at 100 MHz the Athlon front side bus actually transfers at a rate similar to a 200 MHz single data rate bus (referred to as 200 MT/s), which was superior to the method used on Intel's Pentium III (with SDR bus speeds of 100 and 133 MHz).
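
The bus arithmetic here can be checked directly: the effective transfer rate is the bus clock multiplied by the number of transfers per clock cycle. A quick sketch:

```python
# effective front side bus transfer rate, in megatransfers per second
def transfer_rate_mts(clock_mhz, transfers_per_cycle):
    return clock_mhz * transfers_per_cycle

athlon_ev6 = transfer_rate_mts(100, 2)   # EV6 DDR: two transfers per clock
pentium3   = transfer_rate_mts(133, 1)   # Pentium III SDR: one per clock
print(athlon_ev6, pentium3)  # 200 133
```

So a 100 MHz DDR bus matches the 200 MT/s figure in the text, ahead of the Pentium III's fastest 133 MT/s SDR bus.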

AMD designed the CPU with more robust x86 instruction decoding capabilities than those of the K6, to enhance its ability to keep more data in flight at once. Athlon's CISC-to-RISC decoder triplet could potentially decode 6 x86 operations per clock, although this was somewhat unlikely in real-world use. The critical branch predictor unit, essential to keeping the pipeline busy, was enhanced compared to what was onboard the K6. Deeper pipelining with more stages allowed higher clock speeds to be attained.[4] Whereas the AMD K6-III+ topped out at 570 MHz due to its short pipeline, even when built on the 180 nm process, the Athlon was capable of going much higher.

AMD ended its long-time handicap in floating point x87 performance by designing an impressive super-pipelined, out-of-order, triple-issue floating point unit. Each of its three units was tailored to calculate an optimal type of instruction, with some redundancy. By having separate units, it was possible to operate on more than one floating point instruction at once. This FPU was a huge step forward for AMD. While the K6 FPU had looked anemic compared to the Intel P6 FPU, with Athlon this was no longer the case.

The 3DNow! floating point SIMD technology, again present, received some revisions and a name change to "Enhanced 3DNow!". Additions included DSP instructions and an implementation of the extended-MMX subset of Intel SSE.

CPU caching onboard the Athlon consisted of the typical two levels. Athlon was the first x86 processor with a 128 KiB split level 1 cache; a 2-way associative (later 16-way) cache separated into 2×64 KiB for data and instructions (Harvard architecture). This cache was double the size of the K6's already large 2×32 KiB cache, and quadruple the size of the Pentium II and III's 2×16 KiB L1 cache. The initial Athlon (Slot A, later renamed Athlon Classic) used 512 KiB of level 2 cache separate from the CPU, on the processor cartridge board, running at 50% to 33% of core speed. This was done because the 250 nm manufacturing process was too large to allow for on-die cache while maintaining a cost-effective die size. Later Athlon CPUs, afforded greater transistor budgets by smaller 180 nm and 130 nm process nodes, moved to on-die L2 cache at full CPU clock speed.
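
The cache-size comparisons above are simple arithmetic on the split L1 figures, which a few lines can confirm:

```python
KIB = 1024

# split L1 caches (data + instruction halves), per the figures above
athlon_l1 = 2 * 64 * KIB   # Athlon: 2 x 64 KiB
k6_l1     = 2 * 32 * KIB   # K6:     2 x 32 KiB
p2_p3_l1  = 2 * 16 * KIB   # Pentium II/III: 2 x 16 KiB

print(athlon_l1 // k6_l1)    # 2  (double the K6's L1)
print(athlon_l1 // p2_p3_l1) # 4  (quadruple the Pentium II/III L1)
```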

SEARCH ENGINE OPTIMIZATION GUIDE FOR YOU AND FOR ME

Search Engine Optimization (SEO)

Search engine optimization is the process of getting your site to rank as high as possible for the keyword phrases most relevant to your business. Let me briefly explain the best way for your blog or website to gain higher ranks in every search engine there is. This will help us understand more about search engine optimization, better known as SEO.

Before beginning a search engine optimization (SEO) project, it is important to understand the process involved in an effective SEO campaign. To that end, I have highlighted the five-step process and described the activities involved in each of these steps.

1. Base Line Reporting
The first step in any effective SEO campaign is understanding the site's current starting position within the search engines. Doing so ensures that you know the specific areas that need work and provides a baseline against which to gauge the campaign's subsequent success. We would use a program like WebPosition to analyze the site's starting position.

In addition, access to site traffic information is very important. We will analyze whatever current site traffic information is available to discover which search engines are sending visitors and which keyword phrases they are using.

2. Keyword Research
We propose to meet with you and identify a group of 10-12 keyword phrases that will be used in this search engine optimization campaign. This step is critical and requires a considerable amount of time: finding a good set of phrases means balancing two important factors, high usage by searchers and relatively low competition within the search engines.
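
That usage-versus-competition balance can be sketched as a simple ratio score. Real keyword tools weigh many more factors; the phrases and numbers below are purely hypothetical:

```python
# hypothetical keyword data: (phrase, monthly searches, competing pages)
candidates = [
    ("search engine optimization", 50000, 2_000_000),
    ("seo for small bakeries",       800,     4_000),
    ("bakery marketing tips",       2500,    15_000),
]

def score(searches, competition):
    # naive usage-to-competition ratio: favor phrases that are searched
    # often relative to how many pages already compete for them
    return searches / competition

best = max(candidates, key=lambda c: score(c[1], c[2]))
print(best[0])  # seo for small bakeries
```

Note how the highest-volume phrase loses: its competition dwarfs its usage, which is exactly why niche phrases are often the better pick.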

3. Page Optimization and Content Development
Page optimization and content development are critical to search engine success. Content is king in search engine optimization. The search engines love text; high volume, high-quality content related to your business will serve you in a couple of important ways.

First, a site loaded with high-quality content of interest to site users will give them a reason to stay and a reason to come back. After all, the reason they came to your site was to find information. Second, you will receive the added benefit of serving up exactly what the search engines want - content. Search engines will have more information to store about your business and products; that information will translate directly into the ranking they give your site for related keyword phrases.

4. Submission
The submission process involves manually submitting your site to a few select web sites (Yahoo and Google, for example) and also placing a link to the site on other sites that the search engines like Google visit regularly. When the search engine visits these other sites, it would grab your site link in the process. A higher value is placed on that link than on a manual submission done by the site owner.

5. Follow Up Reporting and Analysis
The same reporting done in the first step is repeated at one-, three-, and four-month intervals post-optimization. Rankings and site traffic can then be compared to pre-optimization levels, giving measurable results for the SEO campaign.

Sunday, January 20, 2008

MICROSOFT


Microsoft Corporation (NASDAQ: MSFT) (SEHK: 4338), or often just MS, is an American multinational computer technology corporation with 79,000 employees in 102 countries and global annual revenue of US $51.12 billion as of 2007. It develops, manufactures, licenses and supports a wide range of software products for computing devices. Headquartered in Redmond, Washington, USA, its best selling products are the Microsoft Windows operating system and the Microsoft Office suite of productivity software. These products have prominent positions in the desktop computer market, with market share estimates as high as 90% or more as of 2003 for Microsoft Office and 2006 for Microsoft Windows. One of Bill Gates' key visions is "to get a workstation running our software onto every desk and eventually in every home".

Founded to develop and sell BASIC interpreters for the Altair 8800, Microsoft rose to dominate the home computer operating system market with MS-DOS in the mid-1980s. The company released an initial public offering (IPO) in the stock market, which, due to the ensuing rise of the stock price, has made four billionaires and an estimated 12,000 millionaires from Microsoft employees. Throughout its history the company has been the target of criticism for various reasons, including monopolistic business practices—both the U.S. Justice Department and the European Commission, among others, brought Microsoft to court for antitrust violations and software bundling.

Microsoft has footholds in other markets besides operating systems and office suites, with assets such as the MSNBC cable television network, the MSN Internet portal, and the Microsoft Encarta multimedia encyclopedia. The company also markets both computer hardware products such as the Microsoft mouse and home entertainment products such as the Xbox, Xbox 360, Zune and MSN TV. Known for what is generally described as a developer-centric business culture, Microsoft has historically given customer support over Usenet newsgroups and the World Wide Web, and awards Microsoft MVP status to volunteers who are deemed helpful in assisting the company's customers. The company's official website is one of the most visited on the Internet, receiving more than 2.4 million unique page views per day according to Alexa.com, which ranked the site 18th among all websites by traffic on September 12, 2007.

APPLE INCORPORATED


Apple Inc. (NASDAQ: AAPL, LSE: 0HDZ, FWB: APC), formerly Apple Computer Inc., is an American multinational corporation with a focus on designing and manufacturing consumer electronics and closely related software products. Established in Cupertino, California on April 1, 1976, Apple develops, sells, and supports a series of personal computers, portable media players, mobile phones, computer software, and computer hardware and hardware accessories. As of September 2007, the company operates about 200 retail stores in five countries, and an online store where hardware and software products are sold. The iTunes Store provides music, audiobooks, iPod games, music videos, episodes of television programs, and movies which can be downloaded using iTunes on Mac or Windows, and also on the iPod touch and the iPhone. The company's best-known hardware products include the Macintosh line of personal computers, the iPod line of portable media players, and the iPhone. Apple's software products include the Mac OS X operating system, the iLife suite of multimedia and creativity software, and Final Cut Studio, a suite of professional audio- and film-industry software products.

The company, incorporated January 3, 1977, was known as "Apple Computer, Inc." for its first 30 years. On January 9, 2007, the company dropped "Computer" from its corporate name, reflecting the company's ongoing expansion into the consumer electronics market in addition to its traditional focus on personal computers.

Apple employs over 20,000 permanent and temporary workers worldwide[2] and had worldwide annual sales in its fiscal year 2007 (ending September 29, 2007) of US$24.01 billion.

For a variety of reasons, ranging from its philosophy of comprehensive aesthetic design to their advertising campaigns, Apple has engendered a distinct reputation in the consumer electronics industry and has cultivated a customer base that is unusually devoted to the company and its brand, particularly in the United States.

INTEL CORE 2


The Core 2 brand refers to a range of Intel's consumer 64-bit dual-core and MCM quad-core CPUs with the x86-64 instruction set, and based on the Intel Core microarchitecture, which derived from the 32-bit dual-core Yonah laptop processor. (Note: The Yonah had two interconnected cores, similar to those branded Pentium M, but comprising a single silicon chip or die.) The 2x2 MCM dual-die quad-core CPU had two separate dual-core dies (CPUs) - next to each other - in one quad-core MCM package. The Core 2 relegated the Pentium brand to a lower-end market, and reunified the laptop and desktop CPU lines divided into the Pentium 4, D, and M brands.

The Core microarchitecture returned to lower clock speeds and improved the processors' usage of both available clock cycles and power compared with the preceding NetBurst microarchitecture of the Pentium 4/D-branded CPUs. This translates into more efficient decoding stages, execution units, caches, and buses, reducing the power consumption of Core 2-branded CPUs while enhancing their processing capacity.

The Core 2 brand was introduced on July 27, 2006, and came to comprise the Solo (single-core), Duo (dual-core), Quad (quad-core), and Extreme (dual- or quad-core CPUs for enthusiasts) branches during 2007.

The Core 2 branded CPUs include: "Conroe" and "Allendale" (dual-core for higher- and lower-end desktops), "Merom" (dual-core for laptops), "Kentsfield" (quad-core for desktops), and their variants named "Penryn" (dual-core for laptops), "Wolfdale" (dual-core for desktops) and "Yorkfield" (quad-core for desktops). (Note: For the server and workstation "Woodcrest", "Clovertown", and "Tigerton" CPUs see the Xeon brand.)

The Core 2 branded processors featured the Virtualization Technology (except T52x0, T5300, T54x0, T5500 with stepping "B2", E2xx0 and E4x00 models), Execute Disable Bit, and SSE3. Their Core microarchitecture introduced also SSSE3, Trusted Execution Technology, Enhanced SpeedStep, and Active Management Technology (iAMT2). With a Thermal Design Power (TDP) of up to only 65 W, the Core 2 dual-core Conroe consumed only half the power of less capable, but also dual-core Pentium D-branded desktop chips with a TDP of up to 130 W (a high TDP requires additional cooling that can be noisy or expensive).

As is typical for CPUs, the Core 2 Duo E4000/E6000, Core 2 Quad Q6600, Core 2 Extreme dual-core X6800, and quad-core QX6700 and QX6800 CPUs were affected by minor bugs.

The Core 2 memory management unit (MMU) in X6800, E6000 and E4000 processors does not operate to previous specifications or as implemented in previous generations of x86 hardware. This may cause problems, many of them serious security and stability issues, with existing operating system software. Intel's documentation states that its programming manuals will be updated "in the coming months" with information on recommended methods of managing the Translation Lookaside Buffer (TLB) for Core 2 to avoid issues, and admits that, "in rare instances, improper TLB invalidation may result in unpredictable system behavior, e.g. hangs or incorrect data."

Among the issues noted:

* The write-protect or no-execute bit for a page table entry is ignored.
* Floating point instruction non-coherencies.
* Memory corruption outside the range of permitted writes for a process, triggered by running common instruction sequences.

Intel errata Ax39, Ax43, Ax65, Ax79, Ax90, and Ax99 are said to be particularly serious. Errata 39, 43, and 79, which can cause unpredictable behavior or a system hang, have been fixed in recent steppings.

Among those who have noted the errata to be particularly serious are OpenBSD's Theo de Raadt[1] and DragonFly BSD's Matthew Dillon. Taking a contrasting view was Linus Torvalds, calling the TLB issue "totally insignificant", adding, "The biggest problem is that Intel should just have documented the TLB behavior better."

Thursday, January 17, 2008

RSS FEED - ATOM


A web feed is a data format used for providing users with frequently updated content. Content distributors syndicate a web feed, thereby allowing users to subscribe to it. Making a collection of web feeds accessible in one spot is known as aggregation, which is performed by an Internet aggregator. A web feed is also sometimes referred to as a syndicated feed.

In the typical scenario of using web feeds, a content provider publishes a feed link on their site which end users can register with an aggregator program (also called a feed reader or a news reader) running on their own machines; doing this is usually as simple as dragging the link from the web browser to the aggregator. When instructed, the aggregator asks all the servers in its feed list if they have new content; if so, the aggregator either makes a note of the new content or downloads it. Aggregators can be scheduled to check for new content periodically.

The kinds of content delivered by a web feed are typically HTML (webpage content) or links to webpages and other kinds of digital media. Often when websites provide web feeds to notify users of content updates, they only include summaries in the web feed rather than the full content itself.

Web feeds are operated by many news websites, weblogs, schools, and podcasters.

Web feeds have some advantages compared to receiving frequently published content via email:

* When subscribing to a feed, users do not disclose their email address, so users are not increasing their exposure to threats associated with email: spam, viruses, phishing, and identity theft.
* If users want to stop receiving news, they do not have to send an "unsubscribe" request; users can simply remove the feed from their aggregator.
* The feed items are automatically "sorted" in the sense that each feed URL has its own sets of entries (unlike an email box, where all mails are in one big pile and email programs have to resort to complicated rules and pattern matching).

A "Feed Reader" is required for using web feeds. This tool works like an automated e-mail program, but no e-mail address is needed: the user subscribes to a particular web feed and thereafter receives updated content every time an update takes place. Feed readers may be online (like a webmail account) or offline; an offline reader downloads the feed to the user's system, and a number of mobile readers have recently arrived on the market.

Feed readers are used in personalized home page services like iGoogle, My Yahoo, or My MSN to make content such as news, weather and stock quotes appear on the user's personal page. Content from other sites can also be added to that personalized page, again using feeds. Organizations can use a web feed server behind their firewall to distribute, manage and track the use of internal and external web feeds by users and groups.

Other web-based tools are dedicated primarily to feed reading; one of the most popular web-based feed readers at this point is Bloglines, which is also free. Safari, Firefox, Internet Explorer 7.0, and many other web browsers can receive feeds from the toolbar, using Live Bookmarks, Favorites, and other techniques to integrate feed reading into the browser. Finally, there are desktop-based feed readers, e.g. NewsGator, FeedDemon, NetNewsWire, Outlook 2007 and Windows Live Mail.
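To make the aggregator idea concrete, here is a minimal sketch in Python that parses an RSS 2.0 feed and pulls out each item's title and link. The sample feed content and the example.com URLs are invented for illustration; a real feed reader would fetch the XML over HTTP (with urllib, for instance) and poll it on a schedule, as described above.

```python
import xml.etree.ElementTree as ET

# A made-up RSS 2.0 document of the kind a blog might publish.
SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Tech Blog</title>
    <item>
      <title>PageRank Made Easy</title>
      <link>http://example.com/pagerank</link>
      <description>A short summary of the post.</description>
    </item>
    <item>
      <title>RSS Feed - Atom</title>
      <link>http://example.com/rss-atom</link>
      <description>Another summary.</description>
    </item>
  </channel>
</rss>"""

def parse_feed(xml_text):
    """Return a (title, link) pair for each item, in feed order."""
    root = ET.fromstring(xml_text)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

# An aggregator would store these entries and flag the new ones.
for title, link in parse_feed(SAMPLE_FEED):
    print(title, "->", link)
```

Notice that the feed only carries summaries and links, which is exactly why many sites publish abbreviated content in the feed and keep the full article on the site.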

APPLE MACBOOK DID IT AGAIN


This new Apple MacBook laptop was first presented at the Apple MacWorld Conference by Steve Jobs, Chief Executive Officer of Apple. The new laptop beats every other laptop in size and features: it is Wi-Fi ready, has an 80 GB hard disk, and is less than an inch thick. It was pulled out of an envelope before it was disclosed to the participants of the conference. A super-slim laptop.

Monday, January 14, 2008

BLACK HAT TECHNIQUE

Black Hat search engine optimization is customarily defined as techniques that are used to get higher search rankings in an unethical manner. These black hat SEO techniques usually include one or more of the following characteristics:

* breaks search engine rules and regulations
* creates a poor user experience directly because of the black hat SEO techniques utilized on the Web site
* unethically presents content in a different visual or non-visual way to search engine spiders and search engine users.

A lot of what is known as black hat SEO actually used to be legitimate, but some folks went overboard, and now these techniques are frowned upon by the SEO community at large. Black hat practices can provide short-term gains in rankings, but if you are discovered using these spammy techniques on your Web site, you run the risk of being penalized by search engines.

Black hat SEO is basically a short-sighted solution to a long-term problem: building a Web site that provides a great user experience.

CHINA MOBILE ENDS TALK WITH APPLE

China Mobile Ltd., the world's biggest wireless-phone company by users, said it ended talks to sell Apple Inc.'s iPhone handset in China, declining to say why the discussions were discontinued.

The negotiations ended because Apple wanted a larger share of revenue from game, music and video downloads than China Mobile would offer, the Sina.com Web site reported today. Apple shares rose the most in more than a year in November after China Mobile Chairman Wang Jianzhou said the Beijing-based carrier was in talks to sell the iPhone.

Now that the talks are over, many expect Apple's iPhone will find it hard to enter China's mobile market. Without China Mobile Ltd., Apple's entry into China won't be substantial.

Saturday, January 12, 2008

RSS FEED

A feed is a regularly updated summary of web content, along with links to full versions of that content. When you subscribe to a given website's feed by using a feed reader, you'll receive a summary of new content from that website. Important: you must use a feed reader in order to subscribe to website feeds. When you click on an RSS or Atom feed link, your browser may display a page of unformatted gobbledygook.

PAGE RANK MADE EASY

Google PageRank Explained
How many links do you need to get a certain pagerank?
PageRank as explained by Google
PageRank Technology

Factors that can increase your Google PageRank

Now, the Google PageRank algorithm can be very complex, yet it is a friendly invention. Here is a list of things that could help boost your Google PageRank, with a rating beside each of how important we think it is.

* Update Pages Frequently 2/10
* Add Pages Frequently 4/10
* Good Neighborhood Directories with high PageRank Levels 7/10
* Monster Websites 7/10
* Quality Inbound links 8/10
* Quality Relevant Links 9/10
* No Broken Links 5/10
* Article Submissions (this can increase your PageRank by getting more inbound links)
* All these put together 10/10

Factors that can decrease your Google PageRank

* Bad inbound links such as Poker, Porn, Sex, Drugs, or anything to that nature
* Link spamming
* Bad Content
* Lots of broken links
* SEO Black Hat Techniques

How Google PageRank is Calculated

Okay, now we get to the Google PageRank calculation. This is very simple, so pay attention; I learned it overnight (not really). The truth about the PageRank calculation is that no one outside Google knows exactly how it works, but people have worked out a good deal of it over time. Let's get into an example.

PR(A) = (1-d) + d(PR(t1)/C(t1) + ... + PR(tn)/C(tn))

Above is the PageRank equation that was published during Google's early development. That's right, the actual equation. Google won't tell us exactly how it is used today, but the equation is good enough for our purposes.

In the equation 't1 - tn' are pages linking to page A, 'C' is the number of outbound links that a page has and 'd' is a damping factor, usually set to 0.85.

A simpler way to think of it is:

a page's PageRank = 0.15 + 0.85 * (a "share" of the PageRank of every page that links to it)

Share = The linking page's PageRank is divided by the number of outbound links

A page "votes" an amount of PageRank onto each page that it links to. The amount of PageRank that it has to vote with is a little less than its own PageRank value (its own value * 0.85). This value is shared equally between all pages it links to.

Therefore, it would be better to be linked from a page with a PageRank of 5 and 2 outbound links than from a page with a PageRank of 8 and 500 outbound links. Don't get me wrong: it would be best to have both pages link to you, but if you had to choose, do the math.
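To make the arithmetic concrete, here is a small Python sketch of the iterative calculation implied by the formula above, along with the share comparison just described. The link graph is invented for illustration, and real PageRank involves refinements (such as handling of dangling pages) that this toy version ignores.

```python
# Iterative sketch of the published formula:
#   PR(A) = (1 - d) + d * (PR(t1)/C(t1) + ... + PR(tn)/C(tn))

def pagerank(links, d=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = set(links)
    for targets in links.values():
        pages.update(targets)
    pr = {page: 1.0 for page in pages}      # start every page at 1.0
    for _ in range(iterations):
        new_pr = {}
        for page in pages:
            # Sum each inbound linker's PageRank divided by its
            # number of outbound links (its "share").
            share = sum(pr[src] / len(out)
                        for src, out in links.items()
                        if page in out)
            new_pr[page] = (1 - d) + d * share
        pr = new_pr
    return pr

# The comparison from the text: a PR-5 page with 2 outbound links
# passes far more value than a PR-8 page with 500 outbound links.
share_from_pr5 = 0.15 + 0.85 * (5 / 2)      # 2.275
share_from_pr8 = 0.15 + 0.85 * (8 / 500)    # about 0.164
```

Running `pagerank` on a tiny invented graph such as `{"A": ["B"], "B": ["A"]}` shows the iteration settling to a fixed point, which is the behavior the damping factor `d = 0.85` guarantees.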

The Google PageRank scale runs between a PR of 1 and 10, but many people believe the underlying numbers are mapped onto a logarithmic scale. There is very good reason to believe this, but no one outside Google knows for sure; some people have probably figured it out in part, but to be certain, you would have had to write the PageRank algorithm yourself.

Who invented the Google PageRank Algorithm

Google PageRank was developed at Stanford University by Larry Page and Sergey Brin as part of a research project. The project started in 1995 and led to a functional prototype. In 1998, Google was founded.

Friday, January 11, 2008

THE NEXT BEST THING

This is my personal opinion for this year in terms of technology and stuff alike.

This year will be a big bang for companies that make gadgets and software. From digital cameras and cellphones to laptops and PCs, the gadgets you bought last year will be absolutely obsolete this year. If you are planning to sell them, better sell them now while the prices are still high.

Microsoft will still lead the way in software. The Vista hype is still holding up in terms of sales; you might say it will feel like alien technology by next year. Apple will still be in high demand after the release of the iPhone. Google and Yahoo will still be in a head-to-head race over who dominates the World Wide Web.

For those who love cars, this year will be the best yet. Toyota, Mitsubishi, Isuzu, and Suzuki, all Japanese brands, will still dominate the Asian automobile market. New designs are now shipping to the nearest car dealers. Better check them out.

These are just some of my personal predictions for this year. But more is coming, and I will be there to deliver this amazing information to all.


Tuesday, January 8, 2008

PAGE RANK FOR BLOGS

I guess it is no secret that most bloggers out there are now wanting to increase their page ranks. There are also others that were penalized by Google.

Since we are talking about page rank, I would suggest that all bloggers follow these guidelines in order to gain the page rank their blogs deserve.

1. Join forums. Forums are a great way to get links to your website. In most forums you are allowed a signature, and in your signature you can put a link to your website. Another important thing to check is that the forum is somewhat related to your website. You will still get credit if it's not, but if it is related, you will be accomplishing two tasks at once.

You will be advertising your website (bringing in targeted traffic), and you will also be building your website's presence.

Your website's presence is very important to your survival. The more people see or hear about your website, the more credibility you will have, and this increases the chances that these visitors come back and possibly become leads.

2. Submit to search engine directories. Search engine directories are a good way to get a free link to your website. They also increase your chances of being listed higher on popular search engines like Google and Overture.

Most search engine directories allow you to submit to their website for free. This will allow you to increase your web presence by being listed on another search engine, and it will also be a free link.

Remember: the more links you have, the higher your PR will be.

3. Using ezine ads (or newsletters). Creating an ezine will probably be the most beneficial step you can take to increasing your web presence. When you create an ezine you will be able to keep visitors coming back to your website for more by using signatures and giving special deals.

Ezines will also allow you to increase your back links. When you create an ezine, you can submit information about it to an ezine directory. The directory will then link to your website (thus giving you a free link).

4. Creating and publishing articles. Articles are an easy source of generating new traffic. You can include your signature in your article. This will bring in more traffic from article submission directories.

Your signature usually consists of 4 to 8 lines. Usually the first line is the title of the website you are trying to advertise, the last line is the link to the website, and the lines in between are a sales pitch to draw viewers into your website.

5. Links from related websites. Gaining links from related websites can be one of the most frustrating tasks you can attempt.

They are very easy to find, but can be somewhat difficult to obtain links from.

To find related websites, all you have to do is go to a search engine... say Google... and type in your subject. Maybe your website is based on Ford Mustangs.

You go to Google and type in Ford Mustangs, then you look around for pages that are somewhat related to your website. After you have done this (which should be very easy), you have to contact them in some way to get your link posted on their website. This can be the most difficult task, because a lot of webmasters ignore e-mails from people requesting links; they don't see the importance of it at the time. Other reasons could be that they are rarely online, or that they delete spam mail and sometimes delete important e-mails in the process.

Important note: when looking for link partners, don't just link with websites that have a page rank of 4 or higher. Link with anyone and everyone you get a chance to. If you link to someone that has a page rank of zero, this will not hurt your page rank; it will only increase it, because you are getting a link back to your website. Google doesn't look at your back links' page ranks to determine what yours is going to be. It simply looks at how many back links you have.

So if Google one day decided to link to a newly created website with a page rank of 0 and a domain like mywebsite.geocities.com, that site's page rank wouldn't increase even though Google's page rank is 10; its rank would still be zero, because it would have only that one back link.


Happy blogging guys!

Monday, January 7, 2008

YAHOO COUNTERS GOOGLE CELLPHONE PUSH

The race to the finish line between Yahoo and Google continues. This is what Yahoo is planning to do to counter Google's mobile push: Yahoo unveiled a new platform for mobile Internet users. News of the plan, which was announced at the annual Consumer Electronics Show, comes two months after Google took the wraps off its Android mobile software, along with an alliance of handset makers who plan to use the technology.

With Yahoo's plan to create a new mobile platform for Internet users on mobile phones, it is also now racing Microsoft. Expect more counterpunches between these giants of the Internet industry. It will be like watching a boxing match of super heavyweights. Round 1 starts now... the bell sounds!

MICROSOFT XBOX OWNERS

Microsoft said yesterday that owners of its Xbox 360 games console will be able to buy shows such as "Desperate Housewives" from Disney's ABC, and watch films including "Rocky" from MGM. The company will also develop NBC's Web site for the 2008 Olympic Games in Beijing, Chairman Bill Gates said.

Microsoft, the largest software maker, is planning to create more software and gadgets for Internet users. Expect more from Microsoft this year. Microsoft has shipped 100 million copies of Windows Vista, which went on sale broadly in January 2007; that's a faster growth rate than any previous version of Windows released by the Redmond, Washington-based company.

With this remark from the world's largest software maker, 2008 will indeed be a blast for Microsoft and all its users out there.

APPLE HIRES AVON CEO

The chief executive officer of Avon cosmetics has been elected to a seat on Apple's board of directors: Andrea Jung of Avon is now a member of the board.

Jung is now the eighth member of Apple's board; she's also the only female on the board. In announcing her appointment, Apple CEO Steve Jobs spoke about her strong leadership and marketing background.

In addition to sitting on Apple's board of directors, Jung is a member of the New York Presbyterian Hospital board of trustees, the board of directors of General Electric and the Catalyst board of directors.