Archive for November, 2012

Project Management in 311 words

November 23, 2012

If a graduate is highly effective in his/her current operational role, will he/she be promoted to a managerial position? Occasionally it does happen, but it is important to note that a managerial position requires a completely different set of skills, the most important of which is project management.

Project Management as a science is an attempt to break down the process of “making ideas come to life” into manageable sub-processes. Project Management as an art is the challenge of unraveling interconnected real-world issues and taking appropriate action in an environment of limited information. The science of project management is described in the PMBOK (Project Management Body of Knowledge), while the art comes with experience.

There are two key concepts in project management that I find fundamental in many other areas of business. One of them is the “Plan-Execute-Control-Improve” cycle, which reminds us to think our actions through before we act, to compare our performance to our plan, and to learn from the deviations that occur. If any part of this cycle is dropped, the quality of your product will suffer, whether that product is an actual project or some other intellectual endeavor.

The second concept, which is more directly applicable to project management, is the interconnection among project scope, project cost, and project time. If you plan to cover more ground on your project, you must expect the cost and/or time to increase. Likewise, if you wish to reduce the cost, you must expect the amount or pace of work to decrease, and so on. However, the relationship is frequently super-linear rather than proportional: if you want the project to be done twice as fast, you are likely to pay more than twice the cost.
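To make the trade-off concrete, below is a minimal Python sketch of a hypothetical cost model with a super-linear schedule-compression penalty. The function name, the exponent of 1.6, and all the figures are illustrative assumptions, not data from any real project.

```python
# Illustrative sketch only: a hypothetical cost model in which compressing
# the schedule inflates cost faster than linearly.

def crash_cost(nominal_cost: float, nominal_weeks: float,
               target_weeks: float, exponent: float = 1.6) -> float:
    """Estimated cost of finishing in target_weeks instead of nominal_weeks.

    An exponent > 1 encodes the post's point that the scope/cost/time
    trade-off is often super-linear; 1.6 is an arbitrary example value.
    """
    compression = nominal_weeks / target_weeks
    return nominal_cost * compression ** exponent

if __name__ == "__main__":
    base_cost, base_weeks = 100_000, 20
    # Finishing twice as fast costs roughly 3x, not 2x, under this toy model.
    print(round(crash_cost(base_cost, base_weeks, base_weeks / 2)))  # ~303143
```

Under this toy model, halving the schedule multiplies cost by 2^1.6 ≈ 3, which captures the “more than twice the cost” intuition.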

What are your experiences in running or being part of a project?

~Alex

Categories: Uncategorized

The Future of Internal Audit – Automated Predictive Controls

November 16, 2012

The increasing quantity of data produced by today’s businesses is old news (1, 2, 3); in fact, we have written about it previously in our Power of Data post. The resulting shortage of individuals with the skills and ingenuity to analyze and interpret all the data generated is also widely discussed (The Rapid Growth of Global Data).

So far the Big Data challenge has been addressed by familiar means: emphasis on statistical analysis, open data initiatives, development of robust databases for unstructured data, creation of integration methodologies, improvements to well-known statistical software (SAS, SPSS, MATLAB) and the design of new tools. All of these instruments treat human intelligence as the key to unlocking the value stored in the data.

But what do humans actually do during data analysis?

In my opinion, what we call “valuable analysis” is nothing more than the process of building a predictive model. By looking at the data and passing it through statistical tests, we hope to discover relationships among different inputs that enable us to predict and plan our future actions. For example, we may want to know how much of the weekly demand for our product can be attributed to our marketing campaign and how much of it can be attributed to random luck or factors outside of our control. Alternatively, we might want to know: On what occasions do people use Twitter (or any application X)? What do they talk about? Is the comment positive or negative? How does it affect their driving habits for the day? As you can see, analysis is a process of collecting data or transposing it into a form usable by statistical software, running tests to find relationships, and creating models that help us make decisions. So if our advertising investment did not produce the results we were hoping for, we can redirect the investment to alternative projects.
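As a concrete illustration of the marketing example above, here is a minimal Python sketch that builds exactly this kind of predictive model with ordinary least squares. The data is synthetic and the linear demand equation is an assumption made purely for illustration.

```python
# Minimal sketch: attributing weekly demand to marketing spend versus noise
# outside our control. All numbers are synthetic; the linear form is assumed.
import numpy as np

rng = np.random.default_rng(0)
weeks = 52
marketing_spend = rng.uniform(1_000, 5_000, size=weeks)   # $ per week
noise = rng.normal(0, 50, size=weeks)                     # luck / external factors
demand = 200 + 0.10 * marketing_spend + noise             # the "true" process

# Fit demand ~ intercept + slope * spend with ordinary least squares.
X = np.column_stack([np.ones(weeks), marketing_spend])
(intercept, per_dollar), *_ = np.linalg.lstsq(X, demand, rcond=None)

print(f"baseline demand  = {intercept:.1f} units/week")
print(f"marketing effect = {per_dollar:.3f} units per dollar")
```

If the estimated effect per dollar (times the margin per unit) comes out too low, the campaign money is better redirected to alternative projects, which is exactly the decision the paragraph above describes.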

Why do we need human intelligence for data analysis?

The main challenge computers face in taking on data analysis tasks lies in the fact that a lot of the data is unstructured, comes in a variety of forms, and conceals dynamic relationships. Statisticians, if you will, use their intuition to home in on important relationships, change the form of the data to suit their purposes, and draw on their experience and knowledge to relate one piece of data to another. Computers are not able to do that: they have no purpose when they crunch the data, nor are they able to intuitively know that the number of hens is directly proportional to the number of eggs produced.

Will it stay like this?

Technology has been making great strides in enabling machines to decode new forms of information. Online translators are able to convey the general meaning of foreign texts with ever-improving prose, computer-enabled telephone support services have become commonplace (although most of us probably still prefer to shout “customer representative”), text recognition is more or less perfected, and picture recognition is the next frontier. Although we will not see drastic changes tomorrow morning, over the span of the next 10-15 years we might expect some of the data analysis tasks that we associate with human intelligence to migrate into the domain of computer ability.

How does it affect Internal Audit?

In Internal Audit (IA), two main terms relate to the aforementioned developments: “technology-enabled audits” and “automated controls”. A snapshot of what automated controls are can be found here (Protiviti: Automated Controls). The majority of automated controls come hand in hand with Enterprise Resource Planning (ERP) packages – expensive, enterprise-wide information systems. Since these ERP systems collect a vast amount of information, internal audit professionals can set acceptable variance limits on a range of input variables. A key point here is that IA professionals are the ones who determine what the important relationships are, much like in the data analysis process already discussed.
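As a hedged sketch of what such a variance limit might look like in practice, the short Python snippet below flags records whose fields fall outside a tolerance band. The field names, expected values, and tolerances are invented for illustration and do not come from any particular ERP package.

```python
# Hypothetical variance limits: field -> (expected value, allowed relative deviation).
# Both the fields and the thresholds are invented for illustration.
VARIANCE_LIMITS = {
    "invoice_amount": (10_000.00, 0.25),
    "unit_price":     (42.50,     0.10),
}

def check_record(record: dict) -> list:
    """Return flags for any field outside its acceptable variance limit."""
    flags = []
    for field, (expected, tolerance) in VARIANCE_LIMITS.items():
        value = record.get(field)
        if value is None:
            continue                          # field not present in this record
        if abs(value - expected) / expected > tolerance:
            flags.append(f"{field}={value} deviates more than {tolerance:.0%} from {expected}")
    return flags

print(check_record({"invoice_amount": 14_000.00, "unit_price": 43.00}))
# -> ['invoice_amount=14000.0 deviates more than 25% from 10000']
```

Note that here the auditor, not the system, chose the limits; the next section asks whether that has to remain true.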

How will it change?

I believe that a key function of internal audit departments in the future will be to maintain predictive automated controls and investigate problems flagged by the system. For example, imagine that we are in a trading business. Each employee has a keycard to enter the building and a password to log on to the work computer. Now imagine that the computer was logged in, but the keycard was not swiped. Furthermore, a large transaction was placed from the computer. Quite suspicious, isn’t it? On one hand, it is possible fraud; on the other hand, maybe the employee innocuously forgot to swipe his card. The supervisor can be alerted to investigate which of the two it is.

Now, it is essential to understand the distinction between the two ways in which we can set up this kind of automated control. On one hand, we can hard-code everything: IA staff would essentially create a rule that says “if no card was swiped and the computer was logged in, then alert the supervisor”. In my opinion, this is a highly inefficient and rigid way of doing it. Alternatively, we can implement a system that monitors streams of data, i.e. card swipes and logins, and allows the system itself to build its own rules for what is “normal”. Under this scenario, the system would see that data stream 1 has an input (card swipe), followed by an input in data stream 2 (computer login), followed by many inputs of varying degree from data stream 3 (transactions). This pattern repeats day after day, until one day data stream 1 produces no record, data stream 2 still occurs, and data stream 3 is abnormal; if several streams produce abnormal results, the system contacts the supervisor for investigation.

In this simple example only three data streams were used, but we could conceivably add computer IP addresses, the GPS locations of employees’ work phones, etc. With additional streams of corroborating data, the system could be trained to predict potentially hazardous situations more accurately. And, without the need to reconfigure the system for each specific situation, it becomes possible for IA to delegate the task of building predictive models to computers, while concentrating on the investigation of anomalies.
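Below is a minimal Python sketch of the second, self-learning approach. The stream names, the simple z-score statistic, the ten-day warm-up, and the two-stream alert threshold are all illustrative assumptions; a real system would learn far richer patterns than daily counts.

```python
# Sketch of a monitor that learns per-stream daily baselines and alerts only
# when several streams deviate from "normal" at once. Purely illustrative.
import statistics

class StreamMonitor:
    def __init__(self, z_threshold: float = 3.0, min_abnormal: int = 2):
        self.history = {}                     # stream name -> list of daily counts
        self.z_threshold = z_threshold
        self.min_abnormal = min_abnormal

    def observe_day(self, counts: dict) -> bool:
        """Record one day of per-stream counts; return True if an alert fires."""
        abnormal = []
        for stream, count in counts.items():
            past = self.history.setdefault(stream, [])
            if len(past) >= 10:               # only judge once a baseline exists
                mean = statistics.fmean(past)
                spread = statistics.pstdev(past) or 1e-6   # constant streams
                if abs(count - mean) / spread > self.z_threshold:
                    abnormal.append(stream)
            past.append(count)
        return len(abnormal) >= self.min_abnormal

monitor = StreamMonitor()
for _ in range(30):                           # a month of normal days
    monitor.observe_day({"card_swipes": 1, "logins": 1, "trades": 40})
# No swipe, a login anyway, and unusual trading volume -> alert the supervisor.
print(monitor.observe_day({"card_swipes": 0, "logins": 1, "trades": 400}))  # True
```

Adding more streams (IP addresses, phone GPS) would simply mean more keys in the counts dictionary; no rules need to be rewritten.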

When will such systems be built? How do we build them?

It’s an interesting cross-disciplinary topic that combines aspects of IT (machine learning and data storage), Internal Audit (control environment and technology risks), and human psychology. I would be very interested to know your thoughts and insights!

~Alexey

Follow your passion, make it a movement

November 12, 2012

If you have an idea that you really like and are really passionate about, you should follow it. It doesn’t matter whether it is a business idea or a non-profit idea, whether it targets a niche market or is about to fundamentally change the world…

There will eventually be hurdles in the way of executing your genius idea, but you shouldn’t give up at the first sign of headwind.

A very good example is the story of the founders of Movember. Following up on a dare in a bar in 2003, they created a charity event that raised a total of $126 million worldwide for prostate cancer last year.

Check it out: http://www.ted.com/talks/adam_garone_healthier_men_one_moustache_at_a_time.html

Categories: Uncategorized

People or Process Reliant

November 2, 2012

During my comparatively short career I have had the chance to work in a start-up business, a medium-sized enterprise and a large corporation. Needless to say, there are many obvious differences among the three, but one particular aspect that I would like to draw our attention to today is the process of enterprise growth.

First, we need to ask ourselves what the thing we call an organisation actually is. Unfortunately, the sheer volume of information related to organisations makes defining their essence a daunting task, even with Google’s help. For the purpose of this article we will construct our own definition, which draws upon Culture Management and Knowledge Management concepts:

An organisation is a set of behavioural patterns, which are influenced by individual employees, codified norms and historical behavioural patterns.

Four important concepts are tied together in this definition:

Firstly, we assume that it is through the behaviour of its employees that an organisation exhibits its presence. It’s common to see definitions that emphasise culture or brand as the essence of the organisation. However, these definitions are ill-suited for our purpose because they don’t go far enough. While the causes that elicit employee behaviour are many, our proposition is that behaviour is the ultimate variable affected. Culture is important, since it affects behaviour, but so does a policy manual.

Next, we propose three sources of influence that affect behaviour: individual employees, codified norms and historical behavioural patterns. Individual employees bring their own experiences, habits and initiative into the organisation; whether or not their habits become part of the organisation depends on a variety of factors, which will be discussed later. Codified norms are another source of influence and refer to the actual procedures set out in various policy manuals.

Lastly, historical behavioural patterns consist of how the organisation and the people in it have traditionally behaved (this portion is often described as culture) plus significant “out of norm” occurrences. For example, suppose fraud occurred in an organisation and people were so shocked that they became extra vigilant; the resulting organisational culture would be “anti-fraud”, yet it was the past fraudulent behaviour that exerted the significant influence.

Once our framework for analysis is set up, we can ask ourselves how a young company differs from a mature one.

In my experience, young companies are (surprise!) people driven. With few codified norms and virtually no historical behavioural patterns to restrict individual creativity, employees, and especially founders, are able to set a foundation, good or bad, for organisational behaviour. Mature organisations, on the other hand, have processes and policies in place to restrict the influence of individuals. Specific employees are no longer able to do what they like, but need to meet minimum explicit or implicit performance standards, or are restricted in the ways they can perform their functions. On one hand, such development is often criticised, since large organisations are notoriously slow in adapting to change; on the other hand, by codifying behavioural norms, managers can objectively evaluate and adjust organisational behavioural patterns.

The most peculiar situation occurs in growing organisations, where the need for explicit standards and instructions is clearly visible in the increased effort employees spend coordinating activities, yet each position within the organisation is attached to a particular individual and presents itself to managers as a vague black box: “we sort of know what this employee does, but aren’t sure how he does it”.

Does anyone wish to share their experiences in start-up, growing, or mature organisations?

~Alex