Welcome to the most versatile blog when it comes to technology and the world. Everything tech, everywhere tech and everyone tech...
Friday, 29 January 2016
Spam text programmer spared jail
A computer expert has avoided jail after being convicted of helping to send millions of spam messages.
Wednesday, 27 January 2016
Vtech restores some online services after hack
Children's toy firm Vtech has reopened "key functions" of its Learning Lodge gateway for the first time since it was hacked in November 2015.
DDoS: Website-crippling cyber-attacks to rise in 2016
Cyber-attacks that bring down websites and online services have been getting bigger every year. But how will businesses cope in 2016?
Sunday, 24 January 2016
Dronecode Project Gets More Wind Beneath Its Wings
The foundation launched the open source project in 2014 in an effort to create a unified platform for commercial drone technology.
Microsoft Cloud Rains Free Services on Nonprofits
The company's recently formed Microsoft Philanthropies unit will manage the contribution as part of a three-part effort to provide cloud services to worthy nonprofits, boost access to university researchers, and increase last-mile Internet access to communities that lack basic services.
Sony Builds Smart Home Hub Into Ceiling Light
It includes sensors for motion, illumination, temperature and humidity. There's also the requisite WiFi radio, a memory card slot, a condenser microphone, and an infrared controller for air conditioners.
Facebook Opens Sports Stadium
Friday, 22 January 2016
Top 100 Technology Facts
The Fastest Network Cable Can Now Transfer 1 GB in 0.003 of a Second
The world’s fastest network cable allows you to download a movie in seconds.
Direct Brain to Brain Connection was Successful
Are we able to read others' minds?
Robot Beats Japanese Master Swordsman!
Tuesday, 19 January 2016
Minecraft to launch education edition
An "education edition" of Minecraft is to be launched by Microsoft.
The product will offer teachers new ways to use the world-building video game in a range of subjects.
Microsoft paid $2.5bn (£1.8bn) for Mojang, Minecraft's Swedish creator, in 2014. And late last year, it bought the four-year-old MinecraftEdu version of the game from Finland-based independent developer TeacherGaming.
Twitter outage gets the day off to a rough start
People around the globe were stymied for a while Tuesday as they tried to get onto Twitter, which acknowledged that something was "technically wrong."

Government urged to use Bitcoin-style digital ledgers
The UK's chief scientific adviser has urged the government to adopt the technology that powers crypto-currency Bitcoin to run various public services.
Bitcoins are powered by blockchains - digital ledgers that can record anything from Bitcoin transactions to DNA data.
Sir Mark Walport has argued that they could be used by government departments as a more secure way of managing data.
They could be used to help with tax collection, benefits or the issuing of passports, he has said.
Young Workers Fear The Rise Of The Robots
Young people reveal fears of being made redundant by machines, as world leaders prepare to discuss 'Fourth Industrial Revolution'.
12:08 UK, Monday 18 January 2016
Youngsters fear a rise of the machines could leave them redundant
Nearly half of young people fear they will be replaced by machines at work within a decade, according to a survey.
The research by software provider Infosys also found that many people believed their education had failed to prepare them for their jobs. Separately, a report by ICM Unlimited said labour market changes, including the increasing use of robots and artificial intelligence, would result in the loss of 5.1 million jobs within five years.
Saturday, 16 January 2016
World's First Flying Taxi Drone Unveiled
The manned drone has eight propellers and can travel autonomously, meaning the passenger can just climb in and relax.
12:25 UK, Thursday 07 January 2016
The world's first flying drone taxi has been unveiled at the Consumer Electronics Show (CES) in Las Vegas.
The EHang vehicle is held aloft by eight propellers and can travel autonomously meaning the passenger can just climb in and relax.
On-board systems stop it from colliding with other aircraft and buildings, and an emergency system allows it to temporarily hover in one spot mid-air in case of an emergency on the ground.
EHang was founded in 2014 and has raised around $50m (£34m) in venture funding so far.
The passenger simply needs to enter their destination on boarding.
Executives told attendees at the CES trade fair that it could be the future of personal transport.
It is able to carry a single passenger for 23 minutes at a speed of 60mph, and features gull-wing doors and arms that fold up.
EHang executives added that the drone can be fully charged in two hours.
Other than entering the destination, the passenger will have no other input into how the drone operates.
The company said this will make the machine safer by eliminating "the most dangerous part of standard modes of transportation, human error".
It is likely to be several years before the drone can go into full production - and only if it is approved by air safety authorities.
US military bosses want to build a PHOTON DETECTOR: Darpa issues open call to design a gadget that counts light particles.
Every object in the universe has a dual nature, according to quantum physics.
A photon, for instance, can act like a wave or a particle, depending on the situation it's in.
This means it's almost impossible to physically count these building blocks of light.
But that's exactly what the US military wants to do – and it has issued an open call for scientists and engineers to help it create an accurate photon detector.
Under the Fundamental Limits of Photon Detection Program, Darpa wants a device that can pick out individual photons, according to a report in Engadget.
This could help every system that uses light, from medical scanners to night vision in the army, and even self-driving cars.
'The goal of the Detect program is to determine how precisely we can spot individual photons, and whether we can maximise key characteristics of photon detectors simultaneously in a single system,' said Prem Kumar, Darpa program manager.
'This is a fundamental research effort, but answers to these questions could radically change light detection as we know it and vastly improve the many tools and avenues of discovery that today rely on light detection.'
Photons in the visible range occupy, at minimum, a cubic micron of space, which might seem to make them easy to distinguish and count.
The difficulty arises when light interacts with matter.
A cubic micron of conventional photon-detection material has more than a trillion atoms, and the incoming light will interact with many of those atoms simultaneously.
That cloud of atoms has to be modelled quantum mechanically to conclude with precision that a photon was actually there.
And modelling at that massive scale hasn't been possible — that is, until recently.
'For decades we saw few significant advances in photon detection theory, but recent progress in the field of quantum information science has allowed us to model very large and complicated systems,' Kumar said.
Advances in nano-science have also been critical, he added.
'Nano-fabrication techniques have come a long way. Now not only can we model, but we can fabricate devices to test those models.'
The agency is holding an information day on January 25th in Arlington, Virginia, to reveal more about the project.
PHOTON BREAKTHROUGH COULD PAVE WAY FOR SUPERFAST COMPUTERS
A theoretical physicist has been able to explain how to capture particles of light at room temperature.
In doing so, Alex Kruchkov has confirmed the existence of a ‘new state of light’ which could pave the way for advanced computer chip, laser and solar panel technology.
Previously, getting hold of these particles - called photons - was only thought to be possible under extremely cold temperatures.
Light is made up of tiny quantum particles called photons. Physicists know that when quantum particles condense, they lose their individual identity.
Their different energy levels collapse into a single macroscopic quantum state, causing them to behave like clones and form a ‘super particle’ or wave known as a Bose-Einstein condensate.
A long-standing question in physics has been whether or not photons can be condensed like other quantum particles such as rubidium atoms.
In theory it should work, but the main obstacle to proving the effect has always been that photons have no mass, which is a key requirement for a Bose-Einstein condensate.
Mr Kruchkov’s mathematical model shows that photons can be captured in a Bose-Einstein condensate state, and at room temperature and pressure.
Praying Mantises Given 3D Specs By Scientists
When the insects are shown simulations of their favourite food on a screen they only try to catch it if the video is shown in 3D.
16:08 UK, Thursday 07 January 2016
Mantises get their name because they look as if they are at prayer
Insects fitted with tiny 3D glasses are showing scientists how they can catch fast-moving camouflaged crickets despite having brains no bigger than a pin head.
The plane that can detach its entire CABIN in the event of an emergency: Concept can 'drop' passengers to safety using built-in parachutes.
Engineers have revealed a radical new way to save plane passengers in the event of an emergency.
The design shows an aircraft with a detachable cabin that releases in emergency situations.
During take-off, landing or flight, the redesigned cabin detaches from the plane and safely lands on the ground or water– saving everyone's lives on board.
Google's Reddit AMA about its new tablet just went horribly wrong
Google engineers who built the company's new Pixel C tablet did an "Ask Me Anything" chat on Reddit Thursday, and, well ... oy.
The engineers were faced with an onslaught of questions about why their tablet stinks. The vast majority of questions and comments were negative, and the Google team didn't exactly face those critiques head-on.
Wednesday, 13 January 2016
Google Calendar just got more helpful
Goodbye, post-it notes on fridge doors. Now, your to-do lists have a new home.
Starting this week, Google (GOOG) will finally allow you to add reminders to Google Calendar.
"Often our to-do list is elsewhere, separate from the Calendar that organizes our day, and we end up overcommitted or miss something important because we forgot to check our list," Google wrote on its official blog.
Microsoft unveils HoloLens Experience at its New York flagship store
Want to try out Microsoft's HoloLens? Get in line.
Demand to play around with Microsoft's new hologram headset is off the charts, so Microsoft has created a space in its new Fifth Avenue flagship store where people can come to experience it for themselves.
Monday, 11 January 2016
5 Biotech Stocks Ready To Crush The Market In 2016
After a volatile 2015, the year ahead marks an important period for the biotech industry. Thanks to new technologies, discovery and development are moving at a rapid pace. The NASDAQ Biotechnology Index and NYSE Arca Biotech Index were up 10.6% and 10.4%, respectively, comparing favorably with the overall stock market, which had a rather lackluster year with the S&P 500 down 0.7%.
Importantly, for 2016, the fundamentals of the biotech sector still look strong.
What is 5G?
Just five years after the first 4G smartphone hit the market, the wireless industry is already preparing for 5G.
Each of the four nationwide cell phone carriers, as well as smartphone chipmakers and the major network equipment companies are working on developing 5G network technology for their customers.
There are many significant hurdles that all the industry players have to clear before you'll see a little 5G symbol next to the signal bars on your smartphone screen. For example, it has yet to be determined what 5G even means, let alone what it will look like and when it will get here.
181,000 drones registered with FAA in two weeks
The FAA says 181,000 drones and counting have been registered since December 21.
In December, the agency made it mandatory for hobbyists to register their drones before flight.
Apple buys AI company that can read your emotions
Apple wants to know how you're feeling. It has acquired an artificial intelligence company that uses facial recognition to read a person's emotions.
World's first cyborg wants to hack your body
Neil Harbisson is the world's first legally recognized cyborg. He has an antenna implanted into his skull that gives him access to something he was born without: the ability to perceive color.
In a world where technology is overwhelming our mental focus and social lives, Harbisson, 32, has a closer relationship with technology than even the most avid smartphone user.
Apple stock plunges below $100. Next stop: $65?
Expert opinions are all over the place.
Wall Street analysts -- the people who get paid to predict where stock prices are headed -- have forecasts for the stock ranging from $65 to $200.
The overwhelming majority of experts still say Apple (AAPL, Tech30) is a "strong buy." The average forecast is for the stock to hit $142 in the coming months, according to data compiled by TipRanks, a site that tracks and ranks analysts on their predictions.
Where the stock is going hinges on iPhone sales and its ability to innovate more.
Apple shares are tanking this week because of a report in Japan's Nikkei newspaper that Apple plans to slash its output of iPhone 6S and iPhone 6S Plus by 30%.
That comes on the heels of an Accenture report warning that demand for smartphones is waning. People don't believe the newer models are all that much better than what they currently have, so they don't want to pay for an upgrade.
The case for Apple tanking
"Given its huge reliance on excessively profitable iPhones, Apple, we believe, is bound to enter a phase of diminishing investment returns," said Per Lindberg, an analyst for ABG Sundal Collier, who predicts the stock will fall to $65.
He argues that a simple analysis shows Apple's earnings would be cut in half if it loses market share or has to reduce prices to remain competitive.
Lindberg isn't your typical Wall Street analyst. He's based in Sweden and is one of only three Apple researchers, out of the dozens, with a "sell" rating on the stock, according to TipRanks and FactSet.
His pessimism is backed up by the stock's recent performance. Apple hasn't closed below $100 a share since October 2014 (the stock dipped briefly below $100 on August 24, the day the Dow fell a historic 1,000 points at the open. But it recovered by the end of the day).
Apple is actually in a "bear market" -- the stock is down more than 25% since its all-time high in April.
The case for Apple rising
But Apple stock has many loyal fans -- much like its products. Apple is by far and away the most widely held stock in the investment portfolios of "regular Joe" investors. It is also the most looked up stock ticker on CNNMoney's site.
Over the years, Apple stock has made many people rich, and not just corporate executives.
Lindberg's views are refuted by many experts, including Gene Munster, a senior research analyst at Piper Jaffray.
He is the top ranked forecaster for Apple stock on TipRanks (meaning he has made the most accurate predictions), and he believes Apple stock is headed to $179.
Munster says the iPhone 6S scale-back is already priced in. He notes that Apple has always met or exceeded its sales guidance for the past three years.
As recently as December, Apple CEO Tim Cook said iPhone sales would grow in 2016.
If Apple manages to beat the very gloomy expectations that Wall Street now has, the stock should bounce. It was already rebounding somewhat Friday, although it is still below $100.
"We believe... the iPhone franchise is healthy and will benefit over the next several years from the move to annual upgrade programs," wrote Munster in a note this week.
Saturday, 9 January 2016
TOP ALGORITHMS YOU SHOULD KNOW AS A PROGRAMMER
INTRODUCTION
Hey, what's up beautiful viewers! Here are the top algorithms you should know.
The first step towards an understanding of why the study and knowledge of algorithms are so important is to define exactly what we mean by an algorithm. According to the popular algorithms textbook Introduction to Algorithms (Second Edition, by Thomas H. Cormen, Charles E. Leiserson, Ronald L. Rivest, and Clifford Stein), "an algorithm is any well-defined computational procedure that takes some value, or set of values, as input and produces some value, or set of values, as output." In other words, algorithms are like road maps for accomplishing a given, well-defined task. So, a chunk of code that calculates the terms of the Fibonacci sequence is an implementation of a particular algorithm. Even a simple function for adding two numbers is an algorithm in a sense, albeit a simple one.
Some algorithms, like those that compute the Fibonacci sequences, are intuitive and may be innately embedded into our logical thinking and problem solving skills. However, for most of us, complex algorithms are best studied so we can use them as building blocks for more efficient logical problem solving in the future. In fact, you may be surprised to learn just how many complex algorithms people use every day when they check their e-mail or listen to music on their computers. This article will introduce some basic ideas related to the analysis of algorithms, and then put these into practice with a few examples illustrating why it is important to know about algorithms.
Runtime Analysis
One of the most important aspects of an algorithm is how fast it is. It is often easy to come up with an algorithm to solve a problem, but if the algorithm is too slow, it’s back to the drawing board. Since the exact speed of an algorithm depends on where the algorithm is run, as well as the exact details of its implementation, computer scientists typically talk about the runtime relative to the size of the input. For example, if the input consists of N integers, an algorithm might have a runtime proportional to N^2, represented as O(N^2). This means that if you were to run an implementation of the algorithm on your computer with an input of size N, it would take C*N^2 seconds, where C is some constant that doesn’t change with the size of the input.
However, the execution time of many complex algorithms can vary due to factors other than the size of the input. For example, a sorting algorithm may run much faster when given a set of integers that are already sorted than it would when given the same set of integers in a random order. As a result, you often hear people talk about the worst-case runtime, or the average-case runtime. The worst-case runtime is how long it would take for the algorithm to run if it were given the most insidious of all possible inputs. The average-case runtime is the average of how long it would take the algorithm to run if it were given all possible inputs. Of the two, the worst-case is often easier to reason about, and therefore is more frequently used as a benchmark for a given algorithm. The process of determining the worst-case and average-case runtimes for a given algorithm can be tricky, since it is usually impossible to run an algorithm on all possible inputs. There are many good online resources that can help you in estimating these values.
Approximate completion time for algorithms, N = 100

O(Log(N))    10^-7 seconds
O(N)         10^-6 seconds
O(N*Log(N))  10^-5 seconds
O(N^2)       10^-4 seconds
O(N^6)       3 minutes
O(2^N)       10^14 years
O(N!)        10^142 years
Sorting
Sorting provides a good example of an algorithm that is very frequently used by computer scientists. The simplest way to sort a group of items is to start by removing the smallest item from the group, and put it first. Then remove the next smallest, and put it next and so on. Unfortunately, this algorithm is O(N^2), meaning that the amount of time it takes is proportional to the number of items squared. If you had to sort a billion things, this algorithm would take around 10^18 operations. To put this in perspective, a desktop PC can do a little bit over 10^9 operations per second, and would take years to finish sorting a billion things this way.
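The "remove the smallest, put it first" procedure described above is selection sort; here's a minimal Python sketch of it:

```python
def selection_sort(items):
    """Sort by repeatedly moving the smallest remaining item to the front.

    The double scan (outer loop plus inner min search) is what makes
    this O(N^2) overall.
    """
    items = list(items)  # work on a copy
    for i in range(len(items)):
        # Find the index of the smallest item in the unsorted tail.
        smallest = min(range(i, len(items)), key=items.__getitem__)
        items[i], items[smallest] = items[smallest], items[i]
    return items
```

Running `selection_sort([3, 1, 2])` returns `[1, 2, 3]`; the point is not the code itself but that the inner search over the remaining items is what drives the quadratic cost.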
Luckily, there are a number of better algorithms (quicksort, heapsort and mergesort, for example) that have been devised over the years, many of which have a runtime of O(N * Log(N)). This brings the number of operations required to sort a billion items down to a reasonable number that even a cheap desktop could perform. Instead of a billion squared operations (10^18) these algorithms require only about 10 billion operations (10^10), a factor of 100 million faster.
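As one illustration of the O(N * Log(N)) family, here is a straightforward mergesort sketch:

```python
def merge_sort(items):
    """Sort by splitting in half, sorting each half, and merging.

    Log(N) levels of splitting, O(N) work to merge each level: O(N*Log(N)).
    """
    if len(items) <= 1:
        return list(items)
    mid = len(items) // 2
    left, right = merge_sort(items[:mid]), merge_sort(items[mid:])
    # Merge the two sorted halves by repeatedly taking the smaller head.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    # One of the halves may still have leftover (already sorted) items.
    return merged + left[i:] + right[j:]
```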
Shortest Path
Algorithms for finding the shortest path from one point to another have been researched for years. Applications abound, but let's keep things simple by saying we want to find the shortest path from point A to point B in a city with just a few streets and intersections. There are quite a few different algorithms that have been developed to solve such problems, all with different benefits and drawbacks. Before we delve into them though, let's consider how long a naive algorithm – one that tries every conceivable option – would take to run. If the algorithm considered every possible path from A to B (that didn’t go in circles), it would not finish in our lifetimes, even if A and B were both in a small town. The runtime of this algorithm is exponential in the size of the input, meaning that it is O(C^N) for some C. Even for small values of C, C^N becomes astronomical when N gets even moderately large.
One of the fastest algorithms for solving this problem, known as Dijkstra's Algorithm, runs in O(E*Log(V)) time when implemented with a data structure known as a priority queue, where E is the number of road segments, and V is the number of intersections. To put this in perspective, the algorithm can find the shortest path in a city with 10,000 intersections and 20,000 road segments (there are usually about 2 road segments per intersection) in a small fraction of a second. In some applications, however, even this runtime is too slow (consider finding the shortest path from New York City to San Francisco – there are millions of intersections in the US), and programmers try to do better by using what are known as heuristics. A heuristic is an approximation of something that is relevant to the problem, and is often computed by an algorithm of its own. In the shortest path problem, for example, it is useful to know approximately how far a point is from the destination. Knowing this allows for the development of faster algorithms (such as A*, an algorithm that can often run significantly faster than Dijkstra's algorithm) and so programmers come up with heuristics to approximate this value. Doing so does not always improve the runtime of the algorithm in the worst case, but it does make the algorithm faster in most real-world applications.
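Here's a compact sketch of Dijkstra's algorithm using Python's heapq module as the priority queue; the adjacency-list graph format is an assumption made for illustration:

```python
import heapq

def dijkstra(graph, start):
    """Shortest distances from start to every reachable node.

    graph: {node: [(neighbor, edge_weight), ...]} with non-negative weights.
    """
    dist = {start: 0}
    heap = [(0, start)]  # priority queue of (distance-so-far, node)
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry; a shorter path was already found
        for neighbor, weight in graph.get(node, []):
            new_d = d + weight
            if new_d < dist.get(neighbor, float("inf")):
                dist[neighbor] = new_d
                heapq.heappush(heap, (new_d, neighbor))
    return dist
```

For example, with `graph = {'A': [('B', 1), ('C', 4)], 'B': [('C', 2)]}`, `dijkstra(graph, 'A')` returns `{'A': 0, 'B': 1, 'C': 3}` because the two-hop route through B beats the direct edge to C.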
Approximate algorithms
Sometimes, however, even the most advanced algorithm, with the most advanced heuristics, on the fastest computers is too slow. In this case, sacrifices must be made that relate to the correctness of the result. Rather than trying to get the shortest path, a programmer might be satisfied to find a path that is at most 10% longer than the shortest path.
In fact, there are quite a few important problems for which the best-known algorithm that produces an optimal answer is too slow for most purposes. The most famous group of these problems is called NP, which stands for non-deterministic polynomial (don’t worry about what that means). When a problem is said to be NP-complete or NP-hard, it means that no one knows a good way to solve it optimally. Furthermore, if someone did figure out an efficient algorithm for one NP-complete problem, that algorithm would be applicable to all NP-complete problems.
A good example of an NP-hard problem is the famous traveling salesman problem. A salesman wants to visit N cities, and he knows how long it takes to get from each city to each other city. The question is "how fast can he visit all of the cities?" Since the fastest known algorithm for solving this problem is too slow – and many believe this will always be true – programmers look for sufficiently fast algorithms that give good, but not optimal solutions.
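One common "good but not optimal" approach to the traveling salesman problem is the nearest-neighbour heuristic: always travel to the closest unvisited city next. A sketch, assuming cities are given as 2-D coordinates (a simplifying assumption, since the problem only requires pairwise travel times):

```python
import math

def nearest_neighbor_tour(points):
    """Greedy TSP heuristic: visit the closest unvisited city next.

    Runs in O(N^2); fast, but the tour it builds is generally not optimal.
    points: list of (x, y) coordinates. Returns a visiting order of indices,
    starting from city 0.
    """
    unvisited = list(range(1, len(points)))
    tour = [0]
    while unvisited:
        last = points[tour[-1]]
        # Greedy choice: the nearest city we haven't been to yet.
        nxt = min(unvisited, key=lambda i: math.dist(last, points[i]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour
```

On many inputs this lands within tens of percent of the optimal tour length, which is exactly the trade the text describes: a sufficiently fast algorithm giving a good, but not optimal, solution.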
Random Algorithms
Yet another approach to some problems is to randomize an algorithm in some way. While doing so does not improve the algorithm in the worst case, it often makes very good algorithms in the average case. Quicksort is a good example of an algorithm where randomization is often used. In the worst case, quicksort sorts a group of items in O(N^2), where N is the number of items. If randomization is incorporated into the algorithm, however, the chances of the worst case actually occurring become diminishingly small, and on average, quicksort has a runtime of O(N*Log(N)). Other algorithms guarantee a runtime of O(N*Log(N)), even in the worst case, but they are slower in the average case. Even though both algorithms have a runtime proportional to N*Log(N), quicksort has a smaller constant factor – that is it requires C*N*Log(N) operations, while other algorithms require more like 2*C*N*Log(N) operations.
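A minimal randomized quicksort in this spirit (the list-comprehension partition is chosen for clarity; real implementations partition in place to save memory):

```python
import random

def quicksort(items):
    """Randomized quicksort: O(N*Log(N)) on average.

    Picking the pivot at random makes the O(N^2) worst case
    vanishingly unlikely for any fixed input.
    """
    if len(items) <= 1:
        return list(items)
    pivot = random.choice(items)
    less = [x for x in items if x < pivot]
    equal = [x for x in items if x == pivot]
    greater = [x for x in items if x > pivot]
    return quicksort(less) + equal + quicksort(greater)
```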
Another algorithm that uses random numbers finds the median of a group of numbers with an average runtime of O(N). This is a significant improvement over sorting the numbers and taking the middle one, which takes O(N*Log(N)). Furthermore, while deterministic (non-random) algorithms exist for finding the median with a runtime of O(N), the random algorithm is attractively simple, and often faster than the deterministic algorithms.
The basic idea of the median algorithm is to pick one of the numbers in the group at random, and count how many of the numbers in the group are less than it. Let's say there are N numbers, and K of them are less than or equal to the number we picked at random. If K is less than half of N, then we know that the median is the (N/2-K)-th smallest number among those greater than the random number we picked, so we discard the K numbers less than or equal to the random number. Now, we want to find the (N/2-K)-th smallest number, instead of the median. The algorithm is the same though, and we simply pick another number at random, and repeat the above steps.
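The steps above can be sketched as a recursive selection routine; the function names and the choice of the lower median for even-sized groups are my own simplifications:

```python
import random

def kth_smallest(numbers, k):
    """Find the k-th smallest number (1-based), O(N) on average.

    Pick a random pivot, partition, and recurse into whichever side
    must contain the answer - discarding the rest, as described above.
    """
    pivot = random.choice(numbers)
    less = [x for x in numbers if x < pivot]
    equal = [x for x in numbers if x == pivot]
    greater = [x for x in numbers if x > pivot]
    if k <= len(less):
        return kth_smallest(less, k)
    if k <= len(less) + len(equal):
        return pivot
    # Discard everything <= pivot and adjust k accordingly.
    return kth_smallest(greater, k - len(less) - len(equal))

def median(numbers):
    """Lower median of a non-empty group of numbers."""
    return kth_smallest(numbers, (len(numbers) + 1) // 2)
```

Because each round discards a chunk of the input rather than sorting all of it, the expected total work is proportional to N, not N*Log(N).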
Compression
Another class of algorithm deals with situations such as data compression. This type of algorithm does not have an expected output (like a sorting algorithm), but instead tries to optimize some other criteria. In the case of data compression, the algorithm (LZW, for instance) tries to make the data use as few bytes as possible, in such a way that it can be decompressed to its original form. In some cases, this type of algorithm will use the same techniques as other algorithms, resulting in output that is good, but potentially sub-optimal. JPG and MP3 compression, for example, both compress data in a way that makes the final result somewhat lower quality than the original, but they create much smaller files. MP3 compression does not retain every feature of the original song file, but it attempts to maintain enough of the details to capture most of the quality, while at the same time ensuring the significantly reduced file size that we all know and love. The JPG image file format follows the same principle, but the details are significantly different since the goal is image rather than audio compression.
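To make the LZW idea concrete, here is a toy sketch that emits integer dictionary codes; a real implementation would additionally pack those codes into a bit stream, which is omitted here:

```python
def lzw_compress(text):
    """Toy LZW: return a list of dictionary codes for `text`.

    The dictionary starts with all single characters (codes 0-255) and
    grows as longer repeated substrings are seen, so repetitive input
    compresses into fewer and fewer codes.
    """
    dictionary = {chr(i): i for i in range(256)}
    current, output = "", []
    for ch in text:
        candidate = current + ch
        if candidate in dictionary:
            current = candidate  # keep extending the known substring
        else:
            output.append(dictionary[current])
            dictionary[candidate] = len(dictionary)  # learn a new substring
            current = ch
    if current:
        output.append(dictionary[current])
    return output
```

For example, `lzw_compress("ABABAB")` yields `[65, 66, 256, 256]`: after the first "AB" is learned as code 256, each later "AB" costs a single code. Note that LZW itself is lossless; the lossy trade-off described next for JPG and MP3 is a separate technique layered on top of entropy coding like this.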
The Importance of Knowing Algorithms
As a computer scientist, it is important to understand all of these types of algorithms so that one can use them properly. If you are working on an important piece of software, you will likely need to be able to estimate how fast it is going to run. Such an estimate will be less accurate without an understanding of runtime analysis. Furthermore, you need to understand the details of the algorithms involved so that you’ll be able to predict if there are special cases in which the software won’t work quickly, or if it will produce unacceptable results.
Of course, there are often times when you’ll run across a problem that has not been previously studied. In these cases, you have to come up with a new algorithm, or apply an old algorithm in a new way. The more you know about algorithms in this case, the better your chances are of finding a good way to solve the problem. In many cases, a new problem can be reduced to an old problem without too much effort, but you will need to have a fundamental understanding of the old problem in order to do this.
As an example of this, let's consider what a switch does on the Internet. A switch has N cables plugged into it, and receives packets of data coming in from the cables. The switch has to first analyze the packets, and then send them back out on the correct cables. A switch, like a computer, is run by a clock with discrete steps – the packets are sent out at discrete intervals, rather than continuously. In a fast switch, we want to send out as many packets as possible during each interval so they don’t stack up and get dropped. The goal of the algorithm we want to develop is to send out as many packets as possible during each interval, and also to send them out so that the ones that arrived earlier get sent out earlier. In this case it turns out that an algorithm for a problem that is known as "stable matching" is directly applicable to our problem, though at first glance this relationship seems unlikely. Only through pre-existing algorithmic knowledge and understanding can such a relationship be discovered.
More Real-world Examples
Other examples of real-world problems with solutions requiring advanced algorithms abound. Almost everything that you do with a computer relies in some way on an algorithm that someone has worked very hard to figure out. Even the simplest application on a modern computer would not be possible without algorithms being utilized behind the scenes to manage memory and load data from the hard drive.
There are dozens of applications of complicated algorithms, but I’m going to discuss two problems that require the same skills as some past TopCoder problems. The first is known as the maximum flow problem, and the second is related to dynamic programming, a technique that often solves seemingly impossible problems in blazing speed.
Maximum Flow
The maximum flow problem has to do with determining the best way to get some sort of stuff from one place to another, through a network of some sort. In more concrete terms, the problem first arose in relation to the rail networks of the Soviet Union, during the 1950s. The US wanted to know how quickly the Soviet Union could get supplies through its rail network to its satellite states in Eastern Europe.
In addition, the US wanted to know which rails it could destroy most easily to cut off the satellite states from the rest of the Soviet Union. It turned out that these two problems were closely related, and that solving the max flow problem also solves the min cut problem of figuring out the cheapest way to cut off the Soviet Union from its satellites.
The first efficient algorithm for finding the maximum flow was conceived by two computer scientists, Ford and Fulkerson. The algorithm was subsequently named the Ford-Fulkerson algorithm, and is one of the more famous algorithms in computer science. In the last 50 years, a number of improvements have been made to the Ford-Fulkerson algorithm to make it faster, some of which are dauntingly complex.
Since the problem was first posed, many additional applications have been discovered. The algorithm has obvious relevance to the Internet, where getting as much data as possible from one point to another is important. It also comes up in many business settings, and is an important part of operations research. For example, if you have N employees and N jobs that need to be done, but not every employee can do every job, the max flow algorithm will tell you how to assign your N employees to jobs in such a way that every job gets done, provided that’s possible. Graduation, from SRM 200, is a good example of a TopCoder problem that lends itself to a solution using max flow.
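As a sketch of the Ford-Fulkerson method, here is the variant that finds augmenting paths with breadth-first search (known as Edmonds-Karp); the dict-of-dicts graph representation is an illustrative choice:

```python
from collections import deque

def max_flow(capacity, source, sink):
    """Maximum flow from source to sink via BFS augmenting paths.

    capacity: {u: {v: edge_capacity}}. We maintain residual capacities:
    pushing flow along u->v frees up capacity on the reverse edge v->u,
    which lets later iterations "undo" earlier routing decisions.
    """
    nodes = set(capacity) | {v for u in capacity for v in capacity[u]}
    residual = {u: {} for u in nodes}
    for u in capacity:
        for v, c in capacity[u].items():
            residual[u][v] = residual[u].get(v, 0) + c
            residual[v].setdefault(u, 0)  # reverse edge starts at 0

    flow = 0
    while True:
        # BFS for the shortest path with remaining capacity.
        parent = {source: None}
        queue = deque([source])
        while queue and sink not in parent:
            u = queue.popleft()
            for v, c in residual[u].items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if sink not in parent:
            return flow  # no augmenting path left: flow is maximal
        # Walk back from the sink to find the path and its bottleneck.
        path, v = [], sink
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        bottleneck = min(residual[u][v] for u, v in path)
        for u, v in path:
            residual[u][v] -= bottleneck
            residual[v][u] += bottleneck
        flow += bottleneck
```

For the employee-assignment example in the text, each employee and each job becomes a node, with unit-capacity edges from a shared source to employees, from employees to the jobs they can do, and from jobs to a shared sink; the maximum flow equals the number of jobs that can be covered.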
Sequence comparison
Many coders go their entire careers without ever having to implement an algorithm that uses dynamic programming. However, dynamic programming pops up in a number of important algorithms. One algorithm that most programmers have probably used, even though they may not have known it, finds differences between two sequences. More specifically, it calculates the minimum number of insertions, deletions, and edits required to transform sequence A into sequence B.
For example, lets consider two sequences of letters, "AABAA" and "AAAB". To transform the first sequence into the second, the simplest thing to do is delete the B in the middle, and change the final A into a B. This algorithm has many applications, including some DNA problems and plagiarism detection. However, the form in which many programmers use it is when comparing two versions of the same source code file. If the elements of the sequence are lines in the file, then this algorithm can tell a programmer which lines of code were removed, which ones were inserted, and which ones were modified to get from one version to the next.
Without dynamic programming, we would have to consider a – you guessed it – exponential number of transformations to get from one sequence to the other. As it is, however, dynamic programming makes for an algorithm with a runtime of only O(N*M), where N and M are the numbers of elements in the two sequences.
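The O(N*M) dynamic program that computes this minimum edit count can be sketched as follows:

```python
def edit_distance(a, b):
    """Minimum insertions, deletions, and edits to turn sequence a into b.

    dist[i][j] holds the cost of turning a[:i] into b[:j]; each cell is
    computed from its three neighbors, giving O(N*M) total work.
    """
    dist = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i in range(len(a) + 1):
        dist[i][0] = i  # delete all of a[:i]
    for j in range(len(b) + 1):
        dist[0][j] = j  # insert all of b[:j]
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            dist[i][j] = min(
                dist[i - 1][j] + 1,         # delete a[i-1]
                dist[i][j - 1] + 1,         # insert b[j-1]
                dist[i - 1][j - 1] + cost,  # match or edit
            )
    return dist[len(a)][len(b)]
```

On the example from the text, `edit_distance("AABAA", "AAAB")` returns 2, matching the delete-plus-edit transformation described above; passing lists of source-code lines instead of strings gives the diff-style comparison most programmers know.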
Conclusion
The different algorithms that people study are as varied as the problems that they solve. However, chances are good that the problem you are trying to solve is similar to another problem in some respects. By developing a good understanding of a large range of algorithms, you will be able to choose the right one for a problem and apply it properly. Furthermore, solving problems like those found in TopCoder’s competitions will help you to hone your skills in this respect. Many of the problems, though they may not seem realistic, require the same set of algorithmic knowledge that comes up every day in the real world.