Among the various fields in which artificial intelligence is expected to give businesses an advantage, the web continues to be a leading candidate. Although most companies have done an excellent job of gathering a great deal of information and organizing it in ways that have advanced “big data,” what that information actually means is often elusive.
From an AI standpoint, going from raw data to organized data is relatively easy compared to the much riskier leap from organized data to hypothesis and then to conclusion. What starts out as simple pattern matching rapidly becomes an exponentially harder problem when meaning has to be reduced to logic.
That said, if an algorithm can identify patterns in data, it might be able to identify similar patterns in how that data is analyzed. If successful, such patterns could be standardized and possibly provide automated insight into any data set.
Such a development could portend revolutionary times for bloggers and for firms involved in web development.
The Next Step
Nearly 20 years ago, ISPs began making analytical tools available to webmasters so they could make sense of their traffic logs. It wasn’t long before those web development tools began to provide more than just raw numbers of “hits” and other information that didn’t necessarily reveal anything useful to the website owner.
While technology experts like David Lubbat would agree that analysis isn’t intelligence per se, taking one set of data and drawing conclusions from it through a standardized process is the first step in at least simulating an intelligent process. If computers are expected to do the kinds of things human minds do, they must become capable of dismissing the portions of the probability table that clearly don’t make sense.
For example, if a single visitor is recorded spending 41 hours on a site, a human mind can easily imagine all kinds of reasons that figure can’t be taken at face value. An analytics program or an “intelligent” analysis application has to demonstrate at least that level of sophistication before it takes the next step. What is clear is that the average time on site isn’t really several hours just because one person mistakenly left a browser window open for a couple of days.
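The idea of dismissing implausible entries before averaging can be sketched in a few lines. This is a minimal illustration, not any particular analytics product’s method; the four-hour cap is an assumed heuristic chosen for the example.

```python
# Hypothetical sketch: drop implausibly long sessions before averaging.
# The 4-hour cutoff is an assumption for illustration, not a standard.
MAX_PLAUSIBLE_SECONDS = 4 * 60 * 60

def average_time_on_site(durations_seconds):
    """Average session length in seconds, ignoring sessions too long
    to be real visits (e.g. a browser tab left open for days)."""
    plausible = [d for d in durations_seconds if 0 < d <= MAX_PLAUSIBLE_SECONDS]
    if not plausible:
        return 0.0
    return sum(plausible) / len(plausible)

# One visitor "spent" 41 hours on the site; the filter discards that session.
sessions = [120, 300, 95, 41 * 3600]
print(average_time_on_site(sessions))  # about 172 seconds, not several hours
```

A production system would likely use a statistical rule (such as discarding values several standard deviations from the median) rather than a fixed cap, but the dismissal step is the same in spirit.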
When it comes to a blogger deciding where to promote their work, the possibilities for artificially intelligent applications widen. Here, instead of comparing analytical data from one day or week to the next, an AI could be tasked with comparing one site’s data to the next. These are the kinds of applications prominent bloggers like David Lubbat require to build larger audiences.
What if, for example, an analytics application could determine, based on a set of standard criteria, if the audience for site A is similar enough to the audience for site B to make either author appealing on the opposite site? This kind of thing happens all the time in marketing agencies all over the world. Could it be automated and then applied to a series of ten, 100 or 1000 sites?
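One common way to automate that kind of audience comparison is to reduce each site’s audience to a feature vector and score the pair with cosine similarity. The sketch below is a hedged illustration: the choice of features (shares of traffic by topic interest) and the profiles themselves are invented for the example.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors (0.0 to 1.0
    for non-negative features). Returns 0.0 if either vector is all zeros."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    if norm_a == 0 or norm_b == 0:
        return 0.0
    return dot / (norm_a * norm_b)

# Hypothetical audience profiles: share of traffic by topic interest,
# e.g. [tech, business, lifestyle].
site_a = [0.5, 0.3, 0.2]
site_b = [0.45, 0.35, 0.2]
print(cosine_similarity(site_a, site_b))  # close to 1.0 for similar audiences
```

The “standard criteria” the article imagines would amount to agreeing on which features go into these vectors; once that is fixed, the comparison itself is mechanical.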
Up and Running
While some might argue this process is simply highly sophisticated pattern matching, an artificial intelligence scientist could credibly counter that the human mind is itself a highly sophisticated pattern-matching machine, in addition to its many other capabilities. No mechanism we know of can recognize and analyze a pattern faster than the human mind, and the reason is rather simple.
Electronic circuits cannot take advantage of intuition, and microprocessors don’t have a subconscious either. A young woman has astonishing powers of perception and accurate analysis when it comes to a young man’s posture and facial expressions. It takes her a fraction of a second to process this information and reach her conclusion. Not to put too fine a point on it, she’s usually right.
A computer, on the other hand, would need to traverse a mind-boggling number of possible data combinations before it could even attempt drawing a conclusion. It also needs to perform this procedure each and every time. A young woman has her experiences to draw on.
The key to building an artificially intelligent application that can recognize similarity across multiple data sets is to get it to imitate the young woman’s thinking. For bloggers, that ability would translate to: “that other site’s audience will like this blogger’s work because it resembles this blogger’s audience in the following ways: pattern 1, pattern 2, and so on.”
Once this hurdle is cleared, it would be a simple matter of presenting the application with two sets of analytical data and seeing if it concludes they are similar enough.
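Scaling that pairwise check to ten, 100, or 1,000 sites is then a loop with a decision threshold. The sketch below uses Jaccard overlap of audience interest tags as the similarity measure; the tags, site names, and 0.5 cutoff are all illustrative assumptions, not data from any real platform.

```python
# Hypothetical sketch: compare one blogger's audience against many
# candidate sites and keep those above a similarity threshold.

def jaccard(tags_a, tags_b):
    """Overlap of two audience interest-tag sets, from 0.0 to 1.0."""
    a, b = set(tags_a), set(tags_b)
    return len(a & b) / len(a | b) if (a or b) else 0.0

def similar_sites(home_tags, candidates, threshold=0.5):
    """Return names of candidate sites whose audiences overlap enough
    with the home audience to suggest cross-promotion."""
    return [name for name, tags in candidates.items()
            if jaccard(home_tags, tags) >= threshold]

home = {"python", "analytics", "seo"}
candidates = {
    "site_b": {"python", "analytics", "marketing"},
    "site_c": {"cooking", "travel"},
}
print(similar_sites(home, candidates))  # ['site_b']
```

Where the threshold sits, and whether a simple set overlap is rich enough, are exactly the judgment calls the article suggests an AI would have to learn rather than be given.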
The key to improving such a platform is built into the data sets themselves. If the platform is properly constructed, the more information the application gathers, the more accurate its conclusions will become. Compared to a human mind this might seem a laborious process, but it is likely to improve rapidly.