
History Of Search Engines


During the early development of the web, there was a list of web servers edited by Tim Berners-Lee and hosted on the CERN web server. One historical snapshot from 1992 remains. As more web servers went online, the central list could not keep up; on the NCSA site, new servers were announced under the title "What's New!"
The first tool used for searching on the Internet was Archie. The name stands for "archive" without the "v". It was created in 1990 by Alan Emtage, Bill Heelan and J. Peter Deutsch, computer science students at McGill University in Montreal. The program downloaded the directory listings of all the files located on public anonymous FTP (File Transfer Protocol) sites, creating a searchable database of file names; however, Archie did not index the contents of these sites, since the amount of data was so limited it could be readily searched manually.
Around 2000, Google's search engine rose to prominence. The company achieved better results for many searches with an innovation called PageRank. This iterative algorithm ranks web pages based on the number and PageRank of the other sites and pages that link to them, on the premise that good or desirable pages are linked to more than others. Google also maintained a minimalist interface to its search engine. In contrast, many of its competitors embedded a search engine in a web portal.
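To make that concrete, here is a minimal sketch of the power-iteration idea behind PageRank. The three-page link graph, damping factor and iteration count are illustrative assumptions for this sketch, not Google's production system:

    # Minimal power-iteration sketch of the PageRank idea (illustrative only;
    # the link graph and damping factor below are assumptions, not real data).
    links = {
        "a": ["b", "c"],   # page "a" links to pages "b" and "c"
        "b": ["c"],
        "c": ["a"],
    }
    damping = 0.85
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}

    for _ in range(50):  # iterate until the ranks settle
        rank = {
            p: (1 - damping) / len(pages)
               + damping * sum(rank[q] / len(links[q])
                               for q in pages if p in links[q])
            for p in pages
        }

    print(rank)  # "c" scores highest: it receives the most link weight

Each page's score is split evenly among the pages it links to, so a page that well-linked pages point at ends up ranked highly itself, which is exactly the premise described above.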
By 2000, Yahoo! was providing search services based on Inktomi's search engine. Yahoo! acquired Inktomi in 2002, and Overture (which owned AlltheWeb and AltaVista) in 2003. Yahoo! used Google's search engine until 2004, when it launched its own search engine based on the combined technologies of its acquisitions.
Microsoft first launched MSN Search in the fall of 1998 using search results from Inktomi. In early 1999 the site began to display listings from LookSmart blended with results from Inktomi, except for a short time in 1999 when results from AltaVista were used instead. In 2004, Microsoft began a transition to its own search technology, powered by its own web crawler (called msnbot).
Microsoft's rebranded search engine, Bing, was launched on June 1, 2009. On July 29, 2009, Yahoo! and Microsoft finalized a deal in which Yahoo! Search would be powered by Microsoft Bing technology.

History Of VB


In the beginning, there was BASIC and it was good. Really! I mean, really the beginning. And yes, really good. BASIC ("Beginner's All-purpose Symbolic Instruction Code") was designed as a language to teach people how to program by Professors Kemeny and Kurtz at Dartmouth College w-a-a-a-y back in 1963. It was so successful that soon a lot of companies were using BASIC as their programming language of choice. In fact, BASIC was the very first PC language, because Bill Gates and Paul Allen wrote a BASIC interpreter, in machine language, for the MITS Altair 8800, the computer most people accept as the first PC.
Visual Basic, however, was created by Microsoft in 1991. The main reason for the first version of Visual Basic was to make it much faster and easier to write programs for the new, graphical Windows operating system. Before VB, Windows programs had to be written in C or C++; they were expensive and difficult to write and usually had a lot of bugs in them. VB changed all that.

What is Twitter?


Twitter is a real-time information network that connects you to the latest stories, ideas, opinions and news about what you find interesting. Simply find the accounts you consider most compelling and follow the conversations.

Fifth Generation Of Computer

Fifth Generation - Present and Beyond: Artificial Intelligence


Fifth generation computing devices, based on artificial intelligence, are still in development, though there are some applications, such as voice recognition, that are being used today.
Artificial intelligence is the branch of computer science concerned with making computers behave like humans. The term was coined in 1956 by John McCarthy at the Dartmouth Conference. Artificial intelligence includes:
  • Games Playing: programming computers to play games such as chess and checkers

  • Expert Systems: programming computers to make decisions in real-life situations (for example, some expert systems help doctors diagnose diseases based on symptoms)

  • Natural Language: programming computers to understand natural human languages

  • Neural Networks: systems that simulate intelligence by attempting to reproduce the types of physical connections that occur in animal brains (a toy sketch follows this list)

  • Robotics: programming computers to see and hear and react to other sensory stimuli
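To make the neural-network bullet concrete, here is a toy sketch: a single artificial neuron (a perceptron) whose connection weights are adjusted until it computes logical OR. It is a deliberately minimal illustration, not a model of any real AI system:

    # Toy single-neuron ("perceptron") sketch of the neural-network idea:
    # weighted connections are strengthened or weakened until the unit
    # reproduces the OR truth table. Purely illustrative.
    training_data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
    weights = [0.0, 0.0]
    bias = 0.0
    rate = 0.1  # learning rate

    def fire(x1, x2):
        return 1 if weights[0] * x1 + weights[1] * x2 + bias > 0 else 0

    for _ in range(20):  # a few passes over the data suffice here
        for (x1, x2), target in training_data:
            error = target - fire(x1, x2)
            # Adjust each "connection" in proportion to its input.
            weights[0] += rate * error * x1
            weights[1] += rate * error * x2
            bias += rate * error

    for (x1, x2), target in training_data:
        print((x1, x2), "->", fire(x1, x2))  # matches the OR truth table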
Currently, no computers exhibit full artificial intelligence (that is, are able to simulate human behavior). The greatest advances have occurred in the field of games playing. The best computer chess programs are now capable of beating humans. In May 1997, an IBM supercomputer called Deep Blue defeated world chess champion Garry Kasparov in a chess match.
In the area of robotics, computers are now widely used in assembly plants, but they are capable only of very limited tasks. Robots have great difficulty identifying objects based on appearance or feel, and they still move and handle objects clumsily.
Natural-language processing offers the greatest potential rewards because it would allow people to interact with computers without needing any specialized knowledge. You could simply walk up to a computer and talk to it. Unfortunately, programming computers to understand natural languages has proved to be more difficult than originally thought. Some rudimentary translation systems that translate from one human language to another are in existence, but they are not nearly as good as human translators.
There are also voice recognition systems that can convert spoken sounds into written words, but they do not understand what they are writing; they simply take dictation. Even these systems are quite limited: you must speak slowly and distinctly.
In the early 1980s, expert systems were believed to represent the future of artificial intelligence and of computers in general. To date, however, they have not lived up to expectations. Many expert systems help human experts in such fields as medicine and engineering, but they are very expensive to produce and are helpful only in special situations.
Today, the hottest area of artificial intelligence is neural networks, which are proving successful in a number of disciplines such as voice recognition and natural-language processing.
There are several programming languages that are known as AI languages because they are used almost exclusively for AI applications. The two most common are LISP and Prolog.

Fourth Generation Of Computer

Fourth Generation - 1971-Present: Microprocessors

The microprocessor brought the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip containing a CPU. In the world of personal computers, the terms microprocessor and CPU are used interchangeably. At the heart of all personal computers and most workstations sits a microprocessor. Microprocessors also control the logic of almost all digital devices, from clock radios to fuel-injection systems for automobiles.
Three basic characteristics differentiate microprocessors:
  • Instruction Set: The set of instructions that the microprocessor can execute.

  • Bandwidth: The number of bits processed in a single instruction.

  • Clock Speed: Given in megahertz (MHz), the clock speed determines how many instructions per second the processor can execute.
For both bandwidth and clock speed, the higher the value, the more powerful the CPU. For example, a 32-bit microprocessor that runs at 50MHz is more powerful than a 16-bit microprocessor that runs at 25MHz, as the rough comparison below illustrates.
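As a back-of-the-envelope sketch only, assuming (unrealistically) one instruction per clock cycle and that a wider word does proportionally more work per instruction, the arithmetic behind that example looks like this:

    # Crude figure of merit: bits processed per second. The one-instruction-
    # per-cycle assumption is an illustrative simplification, not how real
    # CPUs behave.
    def relative_power(bits, mhz):
        return bits * mhz * 1_000_000

    cpu_a = relative_power(32, 50)  # 32-bit microprocessor at 50MHz
    cpu_b = relative_power(16, 25)  # 16-bit microprocessor at 25MHz
    print(cpu_a / cpu_b)            # -> 4.0: four times the raw throughput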
What in the first generation filled an entire room could now fit in the palm of the hand. The Intel 4004 chip, developed in 1971, located all the components of the computer - from the central processing unit and memory to input/output controls - on a single chip.
CPU is an abbreviation of central processing unit, and is pronounced as separate letters. The CPU is the brains of the computer. Sometimes referred to simply as the processor or central processor, the CPU is where most calculations take place. In terms of computing power, the CPU is the most important element of a computer system.
On large machines, CPUs require one or more printed circuit boards. On personal computers and small workstations, the CPU is housed in a single chip called a microprocessor.
Two typical components of a CPU are:
  • The arithmetic logic unit (ALU), which performs arithmetic and logical operations.

  • The control unit, which extracts instructions from memory and decodes and executes them, calling on the ALU when necessary (a toy model of this cycle follows the list).
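To show how those two components cooperate, here is a toy fetch-decode-execute loop. The three-instruction "machine language" is invented for this sketch; real instruction sets are far richer:

    # Toy model of the control unit's fetch-decode-execute cycle.
    # The instruction set here is hypothetical, for illustration only.

    def alu(op, a, b):
        # Arithmetic logic unit: performs the actual arithmetic.
        return {"ADD": a + b, "SUB": a - b}[op]

    # A tiny program in memory: load 7, add 5, subtract 2.
    memory = [("LOAD", 7), ("ADD", 5), ("SUB", 2), ("HALT", None)]
    accumulator = 0
    pc = 0  # program counter

    while True:
        op, operand = memory[pc]  # fetch: read the next instruction
        pc += 1
        if op == "HALT":          # decode and execute
            break
        elif op == "LOAD":
            accumulator = operand
        else:
            # The control unit calls on the ALU when arithmetic is needed.
            accumulator = alu(op, accumulator, operand)

    print(accumulator)  # -> 10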
In 1981 IBM introduced its first computer for the home user, and in 1984 Apple introduced the Macintosh. Microprocessors also moved out of the realm of desktop computers and into many areas of life as more and more everyday products began to use microprocessors.
As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet. Fourth generation computers also saw the development of GUIs, the mouse and handheld devices.

Third Generation Of Computer

Third Generation - 1964-1971: Integrated Circuits


The development of the integrated circuit was the hallmark of the third generation of computers. Transistors were miniaturized and placed on silicon chips, called semiconductors, which drastically increased the speed and efficiency of computers.
Silicon - atomic symbol "Si" - is a nonmetallic chemical element in the carbon family of elements and the second most abundant element in the earth's crust, surpassed only by oxygen. Silicon does not occur uncombined in nature. Sand and almost all rocks contain silicon combined with oxygen, forming silica. When silicon combines with other elements, such as iron, aluminum or potassium, a silicate is formed. Compounds of silicon also occur in the atmosphere, natural waters, many plants and in the bodies of some animals.
Silicon is the basic material used to make computer chips, transistors, silicon diodes and other electronic circuits and switching devices because its atomic structure makes the element an ideal semiconductor. Silicon is commonly doped, or mixed, with other elements, such as boron, phosphorus and arsenic, to alter its conductive properties.
A chip is a small piece of semiconducting material (usually silicon) on which an integrated circuit is embedded. A typical chip is less than 1/4 square inch and can contain millions of electronic components (transistors). Computers consist of many chips placed on electronic boards called printed circuit boards. There are different types of chips. For example, CPU chips (also called microprocessors) contain an entire processing unit, whereas memory chips contain blank memory.
A semiconductor is a material that is neither a good conductor of electricity (like copper) nor a good insulator (like rubber). The most common semiconductor materials are silicon and germanium. These materials are then doped to create an excess or lack of electrons.
Computer chips, both for CPU and memory, are composed of semiconductor materials. Semiconductors make it possible to miniaturize electronic components, such as transistors. Not only does miniaturization mean that the components take up less space, it also means that they are faster and require less energy.
Instead of punched cards and printouts, users interacted with third generation computers through keyboards and monitors and interfaced with an operating system, which allowed the device to run many different applications at one time with a central program that monitored the memory. Computers for the first time became accessible to a mass audience because they were smaller and cheaper than their predecessors.

Second Generation Of Computer

Second Generation - 1956-1963: Transistors


Transistors replaced vacuum tubes and ushered in the second generation of computers. A transistor is a device composed of semiconductor material that amplifies a signal or opens or closes a circuit. Invented in 1947 at Bell Labs, transistors have become the key ingredient of all digital circuits, including computers. Today's latest microprocessors contain tens of millions of microscopic transistors.
Prior to the invention of transistors, digital circuits were composed of vacuum tubes, which had many disadvantages. They were much larger, required more energy, dissipated more heat, and were more prone to failures. It's safe to say that without the invention of transistors, computing as we know it today would not be possible.
The transistor was invented in 1947 but did not see widespread use in computers until the late 1950s. The transistor was far superior to the vacuum tube, allowing computers to become smaller, faster, cheaper, more energy-efficient and more reliable than their first-generation predecessors. Though the transistor still generated a great deal of heat that subjected the computer to damage, it was a vast improvement over the vacuum tube. Second-generation computers still relied on punched cards for input and printouts for output.
Second-generation computers moved from cryptic binary machine language to symbolic, or assembly, languages, which allowed programmers to specify instructions in words. High-level programming languages were also being developed at this time, such as early versions of COBOL and FORTRAN. These were also the first computers that stored their instructions in their memory, which moved from a magnetic drum to magnetic core technology.
The first computers of this generation were developed for the atomic energy industry.

History Of Acer


Acer has been around since the 1970s, though I remember the brand most from the 1990s, when I bought my first computer. It was one of the companies that attempted to add color and design to the personal computer, hiring Frog Design to come up with a sleeker look in a day when most PCs were beige (remember bland beige?). Acer Aspires were available in the then-shocking colors of charcoal gray and emerald green.
But then, Acer disappeared. Even as a tech reporter covering computers, I had almost forgotten what happened to the Taiwanese company.
In 2001, Acer reorganized and split its two main product lines into two companies. Computers continued under Acer. Its monitors and other peripherals became BenQ, which is alive and well and has its U.S. headquarters in Irvine.
And then, about two years ago, Acer started coming back. The Ferrari laptop caught my attention, but I haven’t paid close enough attention. Apparently in two years, Acer has gone from the nation’s No. 8 PC seller to No. 5, just behind Gateway. Its U.S. market share in the second quarter, according to market researcher International Data Corp., grew to 5.2 percent. Not bad, considering Gateway’s was 5.6 percent.

Sony New CEO


Kazuo Hirai may have been given the corner office suite at the Japanese entertainment and electronics group in April, but you wouldn’t know it by looking at his paycheck.
On Wednesday, after the company held its annual shareholders’ meeting, Sony Corp. disclosed the pay for three of its corporate executives, including Mr. Hirai. In the past fiscal year ended March, Mr. Hirai received a relatively modest ¥115.6 million ($1.45 million) in salary, stock options, and benefits, a 24% cut from his pay in the same period a year earlier, and only a quarter of the compensation received by the man he replaced. He was the lowest paid of the three executives whose pay was disclosed in the regulatory filing.
A Sony spokesman declined to comment on whether Mr. Hirai’s pay will increase in the current fiscal year, but he said top executives will receive a cut in base salary in the year to March 2013, although this will not affect performance bonuses.
Most of Sony’s top brass took a cut in pay to take responsibility for losses at the company. Former CEO Howard Stringer also received a cut in compensation to ¥449.5 million, down 48% from a year earlier. Nicole Seligman, Sony’s general counsel, saw her pay drop by 31% to ¥138.3 million.
The pay cuts reflect shareholder anger at the disappointing performance of the company’s shares. After four straight years in the red including a record annual loss in the past fiscal year, Sony’s shares are at historic lows.
Shares of the Japanese electronics and entertainment group are down nearly 50% from a year ago. It’s not merely a short-term decline either. The company’s stock price has tumbled more than 80% in the last five years.
“Sony’s business is in a very severe state. I am fully aware of this and I promise to change Sony and revive the company,” said Mr. Hirai at the meeting.
He pledged to change Sony by strengthening its core electronics division and fixing the problems at its television division. He also said he aims to restore the sense of wonder consumers experienced from Sony products in the past.
Shareholders didn’t hide their exasperation over the losses. One man questioned why Sony’s new board still included Mr. Stringer and Vice Chairman Ryoji Chubachi, lambasting the new management for not making a clean break from the past regime. Another said Mr. Stringer shouldn’t use the floods in Thailand or the Japanese earthquake and tsunami as excuses for the company’s poor performance.

Short List Of Sony Technologies


Sony has historically been notable for creating its own in-house standards for new recording and storage technologies, instead of adopting those of other manufacturers and standards bodies. The most infamous of these was the videotape format war of the early 1980s, when Sony marketed the Betamax system for video cassette recorders against the VHS format developed by JVC. Sony had launched the Betamax videocassette recording format in 1975.
In 1979 the Walkman brand was introduced, in the form of the world's first portable music player. In 1983 Sony launched the MSX, a home computer system, and introduced the world (with its counterpart Philips) to the Compact Disc (CD). In 1984 Sony launched the Discman series, which extended the Walkman brand to portable CD products. In 1985 Sony launched its Handycam products and the Video8 format; Video8 and the follow-on hi-band Hi8 format became popular in the consumer camcorder market. In 1986 Sony launched Write-Once optical discs (WO), and in 1987 the 4 mm DAT, or Digital Audio Tape, as a new digital audio tape standard. In 1988 it launched Magneto-optical discs of around 125MB for the specific use of archival data storage. Many more formats followed through 2005.

Brief Intro: Sony Corporation


Sony Corporation, commonly referred to as Sony, is a Japanese multinational conglomerate corporation headquartered in Kōnan, Minato, Tokyo, Japan. It ranked 87th on the 2012 list of Fortune Global 500. Sony is one of the leading manufacturers of electronics products for the consumer and professional markets.
Sony Corporation is the electronics business unit and the parent company of the Sony Group, which is engaged in business through its four operating segments – Electronics (including video games, network services and medical business), Motion Pictures, Music and Financial Services. These make Sony one of the most comprehensive entertainment companies in the world. Sony's principal business operations include Sony Corporation (Sony Electronics in the U.S.), Sony Pictures Entertainment, Sony Computer Entertainment, Sony Music Entertainment, Sony Mobile Communications (formerly Sony Ericsson), and Sony Financial. Sony is among the Worldwide Top 20 Semiconductor Sales Leaders and the third-largest television manufacturer in the world, after Samsung Electronics and LG Electronics.

History Of Sindh University


After independence in 1947, the only functioning university in the newly founded nation of Pakistan was the University of the Punjab, serving the developed parts of the Punjab province. The area constituting the Sindh province had been under the academic coverage of the University of Bombay, which had now become a part of India.
A formal academic centre was therefore needed for Sindh, and under constitutional act no. XVII, titled 'University of Sindh', a resolution was passed by the Legislative Assembly of Sindh, giving birth to this new university in the then national capital of Karachi. The act was subsequently revised and modified in 1961 and in later years. However, it was the act of 1972 that provided for greater autonomy and representation of teachers, under which the university currently functions.
In the years after independence, from 1947 to 1955, Hyderabad served as the capital of Sindh, and the university's operations were relocated there from Karachi in 1951, where it formally started functioning as a teaching institution in pursuit of its charter and mission to disseminate knowledge.

Sindh University


The University of Sindh (Sindhi: سنڌ يونيورسٽي) (Urdu: جامعه سندھ‎), informally known as Sindh University (abbreviated SU or USindh), is the second oldest university in Pakistan accredited by the Higher Education Commission of Pakistan. It was founded in Karachi in 1947; when it moved to Hyderabad in 1951, it started functioning as a full-fledged teaching university. The university currently has affiliations with four law colleges, and 74 degree and post-graduate colleges throughout Sindh.

Samsung


Unlike other electronics companies, Samsung's origins did not involve electronics but other products.
In 1938 Samsung's founder Byung-Chull Lee set up a trade export company in Korea, selling fish, vegetables, and fruit to China. Within a decade Samsung had flour mills and confectionery machines and became a corporation in 1951. Humble beginnings.
From 1958 onwards Samsung began to expand into other industries, such as financial services, media, chemicals and shipbuilding, throughout the 1970s. In 1969, Samsung Electronics was established, producing what Samsung is most famous for: televisions, mobile phones (throughout the 90s), radios, computer components and other electronic devices.
In 1987, founder and chairman Byung-Chull Lee passed away and Kun-Hee Lee took over as chairman. In the 1990s Samsung began to expand globally, building factories in the US, Britain, Germany, Thailand, Mexico, Spain and China until 1997.
In 1997 nearly all Korean businesses shrank in size, and Samsung was no exception. It sold businesses to relieve debt and cut its workforce by 50,000. But thanks to the electronics industry it managed to curb this and continue to grow.
The history of Samsung and mobile phones stretches back over 10 years. In 1993 Samsung developed the 'lightest' mobile phone of its era, the SCH-800, which was available on CDMA networks.
Samsung then developed smartphones and a phone combined with an MP3 player towards the end of the 20th century. To this date, Samsung is dedicated to the 3G industry, making video and camera phones at a speed that keeps up with consumer demand. Samsung has made steady growth in the mobile industry and is currently second, though competitor Nokia remains well ahead in market share.

Wikipedia


Wikipedia is a free, collaboratively edited, multilingual Internet encyclopedia supported by the non-profit Wikimedia Foundation. Its 23 million articles, over 4.1 million in the English Wikipedia alone, have been written collaboratively by volunteers around the world. Almost all of its articles can be edited by anyone with access to the site, and it has about 100,000 active contributors. As of November 2012, there are editions of Wikipedia in 285 languages. It has become the largest and most popular general reference work on the Internet, ranking sixth globally among all websites on Alexa and having an estimated 365 million readers worldwide. In 2011, Wikipedia received an estimated 2.7 billion monthly pageviews from the United States alone.
Wikipedia was launched on January 15, 2001 by Jimmy Wales and Larry Sanger. Sanger coined the name Wikipedia, which is a portmanteau of wiki (a type of collaborative website, from the Hawaiian word wiki, meaning "quick") and encyclopedia. Wikipedia's departure from the expert-driven style of encyclopedia building and the presence of a large body of unacademic content have received extensive attention in print media. In 2006, Time magazine recognized Wikipedia's participation in the rapid growth of online collaboration and interaction by millions of people around the world, alongside YouTube, MySpace, and Facebook. Wikipedia has also been praised as a news source because articles related to breaking news are often rapidly updated.

Dell


In 1984, Michael Dell founded Dell to serve customers directly with computers that met their needs. The company was called PC’s Limited, and Dell was still a student at the University of Texas at the time. The following year, the company came out with its very first computer, called the Turbo, which had an eight-megahertz processor. The major goal was to produce personal computer systems that were IBM compatible and built entirely from stock parts.
What set the company apart was not just its consumer-oriented focus but also the way it allowed people to customize their computers during the ordering process. This was possible because each computer was individually assembled.
The company grossed 73 million dollars in its first year and went public in 1988.

HP Laptops

History of HP


Stanford University classmates Bill Hewlett and Dave Packard founded Hewlett-Packard (HP) in 1939. The company's first product, the audio oscillator, was manufactured in a Palo Alto garage; it was an electronic instrument used by sound engineers. Among HP's first customers was Walt Disney Studios, which purchased eight oscillators to develop and test a new-fangled sound system for the film 'Fantasia'. The Hewlett-Packard or HP Company is, at present, one of the top corporations in the information technology (IT) industry. HP is known for revolutionizing the printing industry and is a pioneer in the computer world. Indeed, HP has of late overtaken Dell in terms of computer shipments and sales.

Facebook


Facebook is the world’s largest social network, with over 1 billion monthly active users.
Facebook was founded by Mark Zuckerberg in February 2004, initially as an exclusive network for Harvard students. It was a huge hit: in 2 weeks, half of the schools in the Boston area began demanding a Facebook network. Zuckerberg immediately recruited his friends Dustin Moskovitz, Chris Hughes, and Eduardo Saverin to help build Facebook, and within four months, Facebook added 30 more college networks.

The original idea for the term Facebook came from Zuckerberg’s high school (Phillips Exeter Academy). The Exeter Face Book was passed around to every student as a way for students to get to know their classmates for the following year. It was a physical paper book until Zuckerberg brought it to the internet.
With this success, Zuckerberg, Moskovitz and Hughes moved out to Palo Alto for the summer and rented a sublet. A few weeks later, Zuckerberg ran into Sean Parker, the cofounder of Napster. Parker soon moved into Zuckerberg’s apartment and they began working together. Parker provided the introduction to their first investor, Peter Thiel, cofounder of PayPal and managing partner of The Founders Fund. Thiel invested $500,000 in Facebook.