
Archive for the ‘Strategy’ Category

Multi-professional approach to Data Analytics

March 21, 2013

Data analytics, Enterprise Intelligence, Continuous Assurance, Regression Analysis and Data Life Cycle are terms that you may hear when discussing potential approaches to addressing the Big Data Challenge. Unfortunately, the term “Big Data Challenge” is a misleading one, for it implies that there is only a single problem that needs a solution, while in fact there is a range of unique circumstances that companies face, each requiring its own tailored approach. In this post I will aim to highlight the main areas of concern for Big Data specialists and some of the tools that have been developed to address these problems.

Before we begin, it is important to understand that a number of professions aim to fill the need for data analytics capability in business. Accountants, Actuaries, Internal Auditors, External Auditors, Statisticians, Mathematicians, Data warehouse specialists, Programmers, Risk Managers and Consultants all feel the need to contribute to the discussion. As you can imagine, there is a great variety of problems faced, and each profession has developed its own set of tools to cope with these challenges. Many of the professions struggle to adapt: in many cases statistical analysis has become more prominent, with Statisticians and Actuaries taking the lead and fewer professionals in accounting or consulting having the necessary skills. In other cases, professions come into conflict, with some professionals feeling that their domain is being taken over. As such, there is no single way to distinguish the underlying domains of the Big Data Challenge, but I will do my best to reconcile the various views.

What is the Big Data Challenge?

Most commonly, Big Data is described as an explosion in the amount or frequency of data generated by modern enterprises. However, this is not a useful definition, for it describes only the fact of the occurrence and not the repercussions of such a change.

I would postulate that this data explosion affects us in the following ways:

1. It is harder to find relevant information now than when data was less abundant, because we need to dedicate more resources to searching.

2. It is harder to ensure consistency and compatibility of records than when data was less abundant, because there are more ways in which data is collected.

3. It is harder to detect meaningful patterns within the data than when data was less abundant, because the volume and speed of transactions require additional processing capabilities.

What solutions are out there?

As you can imagine, each organisation has its own unique challenges, and each challenge has several solutions, depending on the type of data, urgency, market conditions, and even the people involved. As such, it is very difficult to create discrete rules that would classify each type of problem and prescribe a particular solution. The framework below is intended as a rough guide rather than a prescription.

1. Getting data warehouses in order and enabling easier access

Believe it or not, data storage, data accuracy and ease of data access have been topics of discussion in the computer science profession for decades. Database structure has had a considerable evolutionary history over the past 50 years. In short, databases became quicker, more tolerant of errors and more flexible. Unfortunately, not all organisations have cutting-edge databases. A great variety of legacy systems and improper ways of using existing systems introduce a number of errors into datasets, errors that need to be remedied if further analysis is to take place. The explosion in data volumes exacerbated the situation by placing additional volume strain on these systems, as well as accuracy and operational requirements (as, for example, is the case for distributed databases). A number of new and established firms have responded to this challenge in a variety of ways, either by developing new database technologies or by dedicating more processing and accuracy-verification resources. This area has traditionally been addressed by IT professionals.

Further reading on this topic can be found here.

2. More advanced and specialised search engines

In a way, mini Big Data problems have been around for centuries. When the first printing press was invented, an explosion in print media warranted the creation of libraries and subsequent catalog systems. A similar experience gave birth to phonebooks. And Google, in its brilliance, brought order to the informational chaos of the early Internet. Since then, several new technologies have emerged to tackle the challenge of finding the correct piece of information within a cluster of related data. Examples of companies involved in this field include IBM (and its famous Watson computer), Sinequa (with its unified information access tools), and Recommind (with its automatic categorisation tools), just to name a few. Each approach uses different underlying technologies, and if your Big Data problem falls into the search engine category, you need to do additional research to understand which technology would work best in your circumstances.

3. Pattern recognition and detection – new and old data analysis techniques

Another domain of Big Data is the need (or the opportunity?) to detect patterns within the data, with a view to making forward-looking predictions or detecting anomalies. The range of situations where this capability might be useful is virtually limitless, varying from pricing, customer management and production planning to fraud detection and equipment monitoring. The methods that address this need fall into three main categories.

The first method is data visualisation. This method is very intuitive and appealing, since we can perceive visual information very rapidly. The majority of data visualisation techniques focus on enabling rapid prototyping of visual models; some examples can be found here. These techniques allow analysts to pinpoint outliers and trends, but they rely heavily on personal interpretation. Additionally, not all phenomena can be expressed in a visual way, with some patterns taking the form of multi-dimensional, multi-order relationships. Furthermore, a great deal of training and experience is needed for these visual models to be used correctly. The human brain excels at finding visual patterns, but in some cases it is susceptible to finding false positives, astrology being one example.
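To make the rapid-prototyping idea concrete, here is a minimal sketch in Python with matplotlib; the data is randomly generated and the three-standard-deviation cutoff is an arbitrary assumption for illustration, not a recommendation.

```python
import numpy as np
import matplotlib.pyplot as plt

# Made-up data: daily transaction counts vs. average transaction value
rng = np.random.default_rng(0)
counts = rng.normal(1000, 100, 365)
values = 50 + 0.02 * counts + rng.normal(0, 3, 365)
values[100] += 40  # an injected anomaly, purely for illustration

# Flag points more than 3 standard deviations away from a fitted linear trend
trend = np.poly1d(np.polyfit(counts, values, 1))(counts)
outliers = np.abs(values - trend) > 3 * np.std(values - trend)

plt.scatter(counts, values, c=np.where(outliers, "red", "steelblue"))
plt.xlabel("Daily transaction count")
plt.ylabel("Average transaction value")
plt.title("Quick visual scan for outliers")
plt.show()
```

Even in a toy example like this, note that the human still decides what to plot and what counts as “far from the trend”, which is exactly the reliance on personal interpretation mentioned above.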

The second method is mathematical modelling. This approach leverages a number of well-known statistical techniques, starting from various types of regression and drawing heavily on differential equations. It has proven effective in a number of applications, such as integration with ERP systems. However, it is expensive and complex to implement. The level of mathematical expertise required and the specialised nature of the models often restrict application of this method to high-value, high-impact projects. Furthermore, most models of this type have limited dynamic flexibility: if the underlying relationships change, the model becomes obsolete. As such, this method is most appropriate for specialised applications in relatively stable environments.
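For a flavour of what the statistical route looks like in practice, the short sketch below fits an ordinary linear regression with scikit-learn; the variables (staffing and machine hours) are invented for illustration, and real models of the kind described above would be considerably more elaborate.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Invented example: predict weekly output from staffing and machine hours
rng = np.random.default_rng(1)
staffing = rng.uniform(10, 50, 200)
machine_hours = rng.uniform(100, 300, 200)
output = 5 * staffing + 1.2 * machine_hours + rng.normal(0, 20, 200)

X = np.column_stack([staffing, machine_hours])
model = LinearRegression().fit(X, output)

print("Coefficients:", model.coef_)    # sensitivity of output to each input
print("R^2:", model.score(X, output))  # how much of the variance the model explains
# If the underlying relationships shift, the fitted coefficients go stale --
# the "limited dynamic flexibility" problem mentioned above.
```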

The third method is automated software modelling, sometimes called artificial intelligence modelling. Instead of hiring a team of mathematicians to build a model, several companies are developing software packages that are themselves capable of choosing which factors are most important in modelling a particular environment. The most notable example of a company engaged in this area is Numenta. While this approach can be orders of magnitude cheaper than traditional statistical approaches, its usefulness rests with high-velocity temporal data applications, such as modelling electricity usage, credit card transactions or monitoring equipment status. This software is also capable of adapting dynamically to underlying changes in the relationships within the data.

Final words

As can be seen from the above list of solutions, the Big Data Challenge is a fragmented problem. Each particular situation demands careful classification of the problem and selection of the appropriate tools to address it. I believe that these tools fall into the three categories described above and that each category is experiencing rapid evolution. The challenge facing many businesses today is navigating this complex environment, and hopefully this article helps them do so.

~Alexey Mitko

The Land of Hairless Carpets

March 19, 2013

During my Bachelor studies my economics professor shared an interesting story, the lessons of which I’m just beginning to grasp. Back in the early 1900s, when vacuum cleaners were a recent invention, many vacuum cleaner producers competed on the suction power of their devices. In the beginning, competition on the power dimension made perfect sense; after all, higher suction power provided better cleaning. Over time, as technology progressed, vacuum cleaners grew more powerful and eventually became capable of tearing hairs out of the carpets they cleaned. Unfortunately, consumers didn’t know at what point a vacuum cleaner becomes a carpet barber, so consumer protection authorities had to step in and restrict how vacuum cleaners could be marketed.

What is the moral of the story? An initially important but subsequently outdated competitive dimension may actually siphon off resources that could have been used for true research and innovation. The mistake of competing on an irrelevant factor is often made when a company loses sight of its purpose. If, in the example above, the purpose of the company was to provide an easy and efficient cleaning tool, then the vacuum’s power is an important factor, but only up to a point. As history has it, dust bags and cyclone vacuums were eventually created, and the overall weight of vacuum cleaners was reduced as well. Product innovation cycles through competitive factors; companies that fail to recognise that end up in the land of hairless carpets.

These principles are not limited to vacuum cleaners! Similar cycles can be observed in the mobile phone industry. Every time a new smartphone comes out, its hardware is carefully examined. Over the years, CPU power and RAM capacity were legitimate competitive dimensions. If your engineers were able to produce faster, lighter phones without increasing power consumption, the resulting improvements contributed directly to the customer experience. The end product was more fluid and could boast a better graphics experience. But these competitive dimensions have diminishing returns. What if the human eye cannot tell the difference between a super-definition display and an ultra-definition one? What if all smartphones on the market are capable of super smooth performance? After all, once response times become minuscule, even order-of-magnitude improvements become hard to notice. It is quite possible that the current smartphone race is reaching its hardware limits, and companies that are not careful may miss the next competitive dimension.

~Alexey


Ethical battle lines of Marketing

February 22, 2013

Marketing has been a core component of business since probably a second after business was invented. At its core, marketing aims to deliver a message to a specific group of people, and that message often tries to persuade those individuals to consider purchasing the product or service in question. Competition in the marketplace makes marketing a vital business function. After all, even if your company makes the best product but nobody knows about it, chances are that your competitors will be able to reap greater rewards. Under competitive pressure it is often essential for marketers to promote the product to its full potential, but they cannot cross the line into misleading. In the US, misleading advertisements are often termed “false advertising” or “deceptive advertising” and are regulated by the Federal Trade Commission. Other countries have similar laws and regulatory bodies; some initiatives that aim to protect consumers are www.isitfair.eu in the EU and the Australian Competition & Consumer Commission in Australia.

It is clear that you can advertise in a deceptive way and that there are regulatory and consumer bodies set up to protect the world from such practices. But what exactly is deceptive? Over the course of history, and with the help of the legislative system, certain marketing practices have become accepted as deceptive: for example, marketing one product and substituting it for something else, creating a pyramid scheme, or forging trust marks. The examples are numerous, but they came about and became accepted as deceptive because injured parties sought compensation in court in years past. But what if the matter is so minor and the legal process so expensive that nobody bothers to seek compensation? Or what if deception cannot be easily proven? I would speculate that there remains a subset of marketing practices that could be called dubious, while regulatory bodies are too busy policing more serious cases.

Just how prevalent are these practices? How used to them are we? I will include several examples below, but feel free to add additional illustrations in the comments.

1. That food looks so good on TV, but not so good in real life. There are actually companies that specialise in replicating popular menu items in plastic. One such company is www.fake-foods.com. Not all replicated foods are used in TV commercials; some are used as restaurant displays and children’s toys, but some do end up being the stars of 30-second movies. Using plastic props in commercials makes perfect sense, since they don’t spoil or wither with time and are able to tolerate high-powered stage lights without melting. But is that deceptive? After all, the food I buy at the store is not plastic and would look quite different on TV.

2. People that look good on TV and look good in real life, but never used the advertised product to achieve their results. I’ll let the actual ad prove my point. Is that deceptive? Those people are in great shape, but they probably achieved such a physique either genetically or by going to the gym, the exact opposite of what the ad claims it can do for you. Could it be that it’s just natural to get models for your commercial? And since everyone does it, it would be against industry practice to do otherwise. Besides, consumers are aware enough to understand the distinction between marketing and reality. To address these arguments, I would like you to have a look at this ad. Did those people use the actual product to achieve their results? Do you think they had hairdressers on the set when filming the commercial? Another example can be seen here.

3. Perfume is all about the smell, right? Perfume ads are an interesting example. On one hand, what is sold is a fragrance, a physical product that has nothing to do with the model in the ad or the shape of the bottle. In fact, very few people would be able to connect the ad and the smell of the product. So is it deception to use pretty models to sell your product? Should you not include testers in every magazine ad? Not quite: while how perfume is advertised has little or nothing to do with the actual smell, something else is sold along with it. That something is the value created by the ad itself, solely in the mind of the consumer. By looking at the model and the elegance of the bottle, some consumers derive satisfaction because they are able to imagine themselves as belonging to that lifestyle image. The feeling that the consumer buys along with the perfume is purely subjective, created only in his or her mind, but it is real and paid for.

The danger with some of these practices is that we become used to them and, as a result, transpose marketing reality into our actual lives. Any other examples?

~Alexey

How long is your attention span?

January 21, 2013

The human brain has an interesting tendency: it strives to create explanations (or, a better word, models) for the environment it observes. Evolutionarily speaking, this trait is highly advantageous; it allows humans to adapt to and predict their environment, thus increasing their chances of survival. But it also creates false positives, building models and finding order where none exists, as, for example, in creating constellations out of a random spray of stars in the night sky. While the night sky is an example of how humans find order in the spatial world, we should also note that false positives may exist in the temporal continuum.

Consider, for example, a recent graduate who becomes a young, aspiring commodities trader circa 2000. It would not take long for her to learn that her peers who advise their clients to buy gold are getting substantially bigger paychecks than she does. So she asks her coworkers for their strategies, their models and explanations for how gold behaves. Whether or not those explanations are correct does not actually matter; as long as they predict a higher gold price and do not substantially deviate from industry thinking, none of them can be proven wrong. Fast forward to 2012: our graduate is now a 32-year-old successful senior manager who built her career in gold trading and received hefty bonuses for the past 10 years. The reason for her success is her unyielding faith in the value of the yellow metal; the peers who doubted the rise of gold in the past are working for her now, since they were not promoted as quickly.

What should interest us in this discussion is the gold pricing model that this manager developed in her head over the years. It can be as simple as a single sentence or as elaborate as a complex mathematical model, but the key question we must ask is whether or not it is biased. Is it? After all, this manager was rewarded for a very particular behaviour for the past 10 years, her peers were punished for predicting lower gold prices, her whole career was built on 10 years of rising gold prices, and the echelons of her analysts, senior and junior, have had similar behaviour cultivated in them since the first day they joined the firm. Are they biased? Would they not look optimistically on any kind of evidence presented to them, and try to find an explanation for why gold should rise again after a dip, just as it did before? Furthermore, this slightly skewed view propagates through the ranks, with each person adding an additional twist of optimism, until reality becomes grossly distorted. How can a person in a position of power be objective when all she was presented with during her career is a single side of the coin? Can the human attention span be long enough to incorporate macroeconomic forces, which take decades to play out, into the mental models that humans so eagerly construct?

For most of us gold trading is of little relevance, but the principles at play apply to situations much closer to home. Consider, for example, a banker who is measured on his yearly performance, while the loans that he makes run for periods longer than 10 years. Would he care if a loan is defaulted on 7 years from now? Probably not; the loan book could have been sold on, or he could have changed jobs several times since then. You can see how this sort of behaviour could quickly become reckless. If banking is not personal enough for you, consider a manager who is responsible for plant maintenance. He is presented with a choice: save some money now, or spend it to upgrade the plant’s equipment. If he saves it now, he is rewarded, albeit at the risk that the machine will have a higher chance of breaking down (but when? 1, 2 or maybe 3 years from now?). How long is his attention span?

In conclusion I would like to share several quotes that drive home some of the principles discussed.

Abraham Maslow “I suppose it is tempting, if the only tool you have is a hammer, to treat everything as if it were a nail.”

Ralph Linton “The last thing a fish would ever notice would be water.”

Daniel Day-Lewis “Perhaps I’m particularly serious, because I’m not unaware of the potential absurdity of what I’m doing.”

Several movies: Anatomy of a Disaster, How the Banks Never Lose

And several interesting articles: Morgan Stanley, Warning to Banks

Enjoy!

~Alexey


(Gold price charts provided by goldprice.org)

The Future of Internal Audit – Automated Predictive Controls

November 16, 2012

The increasing quantity of data produced by today’s businesses is old news (1, 2, 3); in fact, we have written about it previously in our Power of Data post. The resulting shortage of individuals with the skills and ingenuity to analyze and interpret all the data generated is also widely discussed (The Rapid Growth of Global Data).

So far the Big Data challenge has been addressed by familiar means. The importance of statistical analysis, open data initiatives, the development of robust and unstructured databases, the creation of integration methodologies, improvements in well-known statistical software (SAS, SPSS, MatLab) and the design of new packages: all of these instruments place human intelligence as the key to unlocking the value stored in the data.

But what do humans actually do during data analysis?

In my opinion, what we call “valuable analysis” is nothing more than the process of building a predictive model. By looking at the data and passing it through statistical tests, we hope to discover relationships among different inputs that enable us to predict and plan our future actions. For example, we want to know how much of the weekly demand for our product can be attributed to our marketing campaign and how much of it can be attributed to random luck or factors outside of our control. Alternatively, we might want to know: On what occasions do people use Twitter (or any application X)? What do they talk about? Is the comment positive or negative? How does it affect their driving habits for the day? As you can see, analysis is a process of collecting data or transposing it into a form usable by statistical software, running tests to find relationships, and creating models that help us make decisions. So if our advertising investment did not produce the results we were hoping for, we will redirect the investment to alternative projects.
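To make the marketing-versus-demand example a little more tangible, here is a hedged sketch using statsmodels; the weekly figures are fabricated, and a real analysis would have to control for seasonality, promotions and other confounders.

```python
import numpy as np
import statsmodels.api as sm

# Made-up weekly figures: advertising spend and product demand
rng = np.random.default_rng(42)
ad_spend = rng.uniform(5_000, 20_000, 52)
demand = 300 + 0.04 * ad_spend + rng.normal(0, 80, 52)  # noise = factors outside our control

X = sm.add_constant(ad_spend)        # intercept + spend
result = sm.OLS(demand, X).fit()

print(result.params)                 # how many units each extra dollar of spend appears to buy
print(result.pvalues)                # is the relationship distinguishable from luck?
# A weak or insignificant coefficient would argue for redirecting the budget.
```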

Why do we need human intelligence for data analysis?

The main challenge computers face in taking on data analysis tasks lies in the fact that a lot of the data is unstructured, comes in a variety of forms, and conceals dynamic relationships. Statisticians, if you will, use their intuition to narrow in on important relationships, are able to change the form of the data to suit their purposes, and draw on their experience and knowledge, relating one piece of data to another. Computers are not able to do that: they have no purpose when they crunch the data, nor are they able to intuitively know that the number of hens is directly proportional to the number of eggs produced.

Will it stay like this?

Technology has been making great strides in enabling machines to decode new forms of information. Online translators are able to convey the general meaning of foreign texts with ever-increasing literary prose, computer-enabled telephone support services have become commonplace (although most of us probably still prefer to shout “customer representative”), text recognition is more or less perfected, and picture recognition is the next frontier. Although we will not see drastic changes tomorrow morning, over the span of the next 10-15 years we might expect that some of the data analysis tasks that we associate with human intelligence will migrate into the domain of computer ability.

How does it affect Internal Audit?

In Internal Audit (IA), two main terms are related to the aforementioned developments: “technology enabled audits” and “automated controls”. A snapshot of what automated controls are can be found here (Protiviti - Automated Controls). The majority of automated controls come hand in hand with Enterprise Resource Planning (ERP) packages: expensive, enterprise-wide information systems. Since these ERP systems collect a vast amount of information, internal audit professionals can set acceptable variance limits on a range of input variables. A key point here is that IA professionals are the ones determining what the important relationships are, much like in the data analysis process already discussed.
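A hard-coded control of the kind an IA professional might configure could look something like the toy check below; the limits and field names are assumptions for illustration only and are not taken from any particular ERP package.

```python
# Toy rule-based control: flag journal entries that breach fixed, hand-set limits.
LIMITS = {
    "amount": 100_000,        # a single entry above this amount is flagged
    "weekend_posting": True,  # postings on Saturday/Sunday are flagged
}

def flag_entry(entry):
    """Return a list of rule violations for one journal entry (a dict)."""
    violations = []
    if entry["amount"] > LIMITS["amount"]:
        violations.append("amount above limit")
    if LIMITS["weekend_posting"] and entry["weekday"] in ("Sat", "Sun"):
        violations.append("posted on a weekend")
    return violations

print(flag_entry({"amount": 250_000, "weekday": "Sun"}))
# ['amount above limit', 'posted on a weekend']
```

Every threshold here was chosen by a person, which is exactly the point: the auditor, not the system, decides what matters.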

How will it change?

I believe that a key function of internal audit departments in the future will be to maintain predictive automated controls and investigate problems flagged by the system. For example, imagine that we are in a trading business. Each employee has a keycard to enter the building and a password to log on to the work computer. Now imagine that a computer was logged in, but the keycard was not swiped. Furthermore, a large transaction was placed from that computer. Quite suspicious, isn’t it? On one hand, it is possible fraud; on the other hand, maybe the employee innocuously forgot to swipe his card. The supervisor can be alerted and investigate which of the two it is. Now, it is essential to understand the distinction between the two ways in which we can set up this kind of automated control. On one hand, we can hard-code everything: IA staff would essentially create a rule that says “if no card was swiped and the computer was logged in, then alert the supervisor”. In my opinion, this is a highly inefficient and rigid way of doing it. Alternatively, we can implement a system that monitors streams of data, i.e. card swipes and logins, and allows the system itself to build its own rules for what is “normal”. Under this scenario, the system would see that data stream 1 has an input (card swipe), followed by an input in data stream 2 (computer log-in), followed by many inputs of varying degree from data stream 3 (transactions). This pattern repeats day after day, until one day data stream 1 produces no record, data stream 2 still occurs, and data stream 3 is abnormal; if several streams produce abnormal results, the system contacts the supervisor for investigation. In this simple example only 3 data streams were used, but we can conceivably add computer IP addresses, the GPS locations of employees’ work phones, and so on. With additional streams of corroborating data, the system could be trained to predict potentially hazardous situations more accurately. And, without the need to reconfigure the system for each specific situation, it becomes possible for IA to delegate the task of building predictive models to computers, while concentrating on the investigation of anomalies.
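Purely as an illustration of the second, self-learning approach (and not a description of Numenta's or anyone else's actual technology), the sketch below learns what a “normal” day looks like from a few streams of historical events and scores a new day against that baseline; the stream names, history and thresholds are all made up.

```python
import numpy as np

# Each row of (made-up) history is one day: [card swipes, computer logins, large trades]
history = np.array([
    [1, 1, 0], [1, 1, 1], [1, 1, 0], [1, 1, 0], [1, 1, 1],
    [1, 1, 0], [1, 1, 0], [1, 1, 1], [1, 1, 0], [1, 1, 0],
])

mean = history.mean(axis=0)
std = np.maximum(history.std(axis=0), 0.1)  # floor so constant streams don't divide by zero

def anomaly_score(day):
    """Sum of absolute z-scores across streams: how far today sits from learned 'normal'."""
    return float(np.abs((day - mean) / std).sum())

normal_day = np.array([1, 1, 0])
suspicious_day = np.array([0, 1, 5])  # no card swipe, a login occurred, unusual trading volume

print(round(anomaly_score(normal_day), 2))      # small score: nothing to see
print(round(anomaly_score(suspicious_day), 2))  # large score: alert the supervisor
# No rule was hard-coded; adding streams (IP address, phone GPS) just means adding columns.
```

The design choice worth noticing is that the “rules” live in the learned baseline rather than in the code, so the control adapts as behaviour drifts, while the auditor’s job shifts to investigating what gets flagged.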

When will such systems be built? How to build them?

It’s an interesting cross-disciplinary topic that combines aspects of IT, such as machine learning and data storage; of Internal Audit, such as the control environment and technology risks; and of human psychology. I would be very interested to know your thoughts and insights!

~Alexey

People or Process Reliant

November 2, 2012

During my comparatively short career I have had the chance to work in a start-up business, a medium-sized enterprise and a large corporation. Needless to say, there are many obvious differences among the three, but one particular aspect that I would like to draw our attention to today is the process of enterprise growth.

First, we need to ask ourselves what the thing we call an organisation actually is. Unfortunately, the sheer volume of information related to organisations makes defining its essence a daunting task, even with Google’s help. For the purpose of this article we will construct our own definition, which draws upon Culture Management and Knowledge Management concepts:

An organisation is a set of behavioural patterns, which are influenced by individual employees, codified norms and historical behavioural patterns.

Four important concepts are tied together in this definition:

Firstly, we assume that it is through the behaviour of its employees that an organisation exhibits its presence. It is common to see definitions that emphasise culture or brand as the essence of the organisation. However, these definitions are ill-suited for our purpose because they don’t go far enough. While the causes that elicit employee behaviour are multiple, our proposition is that behaviour is the ultimate variable affected. Culture is important, and it affects behaviour, but so does a policy manual.

Subsequently, we propose three sources of influence that affect behaviour: individual employees, codified norms and historical behavioural patterns. Individual employees bring their own experiences, habits and initiative into the organisation. Whether or not their habits become part of the organisation depends on a variety of factors, which will be discussed later. Codified norms are another source of influence and refer to the actual procedures set out in various policy manuals.

Lastly, historical behavioural patterns consist of how the organisation and the people in it have traditionally behaved (this portion is often described as culture) and of significant “out of norm” occurrences. For example, let’s say fraud occurred in the organisation and people were so shocked that they became extra vigilant; the organisational culture would then be “anti-fraud”, yet the past fraudulent behaviour had a significant influence.

Once our framework for analysis is set up, we can ask ourselves how a young company differs from a mature one.

In my experience, young companies are (surprise!) people driven. With few codified norms and virtually no historical behavioural patterns to restrict individual creativity, employees, and especially founders, are able to set a foundation, good or bad, for organisational behaviour. Mature organisations, on the other hand, have processes and policies in place to restrict the influence of individuals. Specific employees are no longer able to do what they like, but need to meet minimum explicit or implicit performance standards, or are restricted in the ways they can perform their functions. On one hand, such a development is often critiqued, since large organisations are notoriously slow in adapting to change; on the other hand, by codifying behavioural norms, managers can objectively evaluate and adjust organisational behavioural patterns. The most peculiar situation happens in growing organisations, where the need for explicit standards and instructions is clearly visible due to the increased effort employees spend coordinating activities, yet each position within the organisation is attached to a particular individual and presents itself to managers as a vague black box: “we sort of know what this employee does, but aren’t sure how he does it”.

Does anyone wish to share their experiences in start-up, growing, or mature organisations?

Alex

Starting your business is someone else’s business

August 18, 2012

The title of this blog post is a paradox. On one hand, trying to start a business is a personal adventure, marked by high aspirations and a notable lack of funds. On the other hand, some entrepreneurs have figured out that helping other people to start their businesses can be a business in itself. And that fact adds complexity to the whole process. Our would-be entrepreneurs have to distinguish between people who offer genuine help or good value for the money they ask, and people like themselves, who are just starting their own business and at the moment cannot truly help would-be entrepreneurs.

For example, you find a person who was able to launch a somewhat successful startup and pay them $300 to speak at your event (or even better: get them to do it for free to promote their business), you rent a room for 3 hours ($300), arrange catering ($200), advertise ($200) and sell tickets to 40 wannabe entrepreneurs at $50 each. The profit is $2000 - $1000 = $1000. Young entrepreneurs are very eager to chase their dreams and are more than willing to pay $50 for a promise of networking and “startup tips”. But the real question is: do you get value for your money? Or are you draining your start-up funds and wasting the most important resource, your time?

I would argue that an entrepreneur should pay only for tangible services like legal services, marketing services, or technical expertise. Networking clubs and startup tips can be found online and for free. The mere fact that someone fills a room with entrepreneurs does not mean that the event will give you the solution to your business; most likely it will leave you $50 short and wanting to pay for other events of this nature. The point I am trying to get across is that other people have made it their business to sell services to people who seek to start their own business. The lesson that many entrepreneurs forget is that some people are not trying to help you, but to make money from you. As an entrepreneur you must be frugal, since your resources are very limited.

Check this website for more startup tips http://frugalentrepreneur.com/


Alex