Archive: 2014 (Features)


Monday, 27 Oct 2014
Cara Quinton

One of us (Sanja Petrovic) recently joined a Division of Operations Management and Information Systems in a Business School as Professor of Operational Research, whilst the other (Bart MacCarthy) has been a Professor of Operations Management in the same Division for over a decade. To many of our colleagues in the Business School we occupy the same territory. We do indeed have some strong areas of overlap in our research interests, but also quite a number of differences in emphasis and approach. A natural question emerges – what is the relationship between Operational Research (OR) and Operations Management (OM)?

Many in the OR community would say they are more or less the same. However, the reverse is not true: many in the OM community do not perceive themselves as OR researchers or practitioners. It seems that, in both the scientific literature and the business community, there are no definitive answers on the scope of, and the differences between, OR and OM.

This is not a new question, of course. The histories of the disciplines are intertwined. Many of the classical problems in OR (e.g. the job shop scheduling problem, the travelling salesman problem, queuing and inventory models) are directly relevant to the design and management of operational systems. Many researchers across the OR and OM disciplines publish in the same academic journals, but then again there are OR journals that some OM academics would not consider at all relevant, and vice versa. There are quite eminent scientists who earnestly believe that OR and OM are synonymous. Others, equally eminent, resolutely maintain that OR is focussed on mathematical modelling, optimisation problems and simulation, while OM is a discipline addressing more general problems about processes and systems, involving a broader range of issues including information systems, organisational and human behaviour, ethics and other softer management disciplines. A fair observation may be that there is no clear and crisp distinction between the disciplines and that there is considerable overlap.

In textbooks and encyclopaedias OR is typically referred to as a discipline that deals with the application of advanced analytical methods to help make better decisions, whilst OM addresses the activities, decisions and responsibilities of managing the design, production and delivery of goods and services. There is general agreement that OR concerns the application of quantitative models and methods to understand and ‘solve’ many types of problems, including those that have a strong operational focus. This means OR involves the application of existing methods, and the development of new ones, in many mathematically defined operations areas. In OR manuscripts one can find key words like optimisation, dynamic control, Markov chains, stochastic analysis, games and risk analysis – though of course some of these terms also appear in papers in OM journals. However, there is also growing recognition of Soft OR approaches, which apply predominantly qualitative techniques with the aim of defining and exploring a given problem from various perspectives.

Similarly, there is agreement that OM is concerned with the creation, design, production and delivery of products and/or services. However, OM places strong emphasis on understanding ‘effectiveness’ in systems design and management, which may not be evident purely from mathematical modelling. Effective deployment, management and practice necessarily bring in broader human, organisational and systems issues. In OM manuscripts key words could include performance measurement, project management, supply chain management, manufacturing and production, energy/transportation and service systems – but, as above, many of these terms appear in papers in OR journals also. Additionally, there is increasing recognition in OM that appropriate mathematical modelling can lead to valuable insights into the design and management of robust operational systems, particularly in large complex systems.

Some would argue that OR journals have prioritised manuscripts that describe methods and algorithms for solving well-defined theoretical problems that are somewhat divorced from real-world applications. The models developed are usually a simplified representation of a real-world problem; the focus is then on rigorous evaluation of a method or algorithm and on performance comparisons with other methods and algorithms reported in the literature. The OM literature publishes both qualitative and quantitative research, perhaps with a greater emphasis on the former. The argument is that the OM literature has placed greater emphasis on real-world problems, including contemporary concepts such as lean thinking, sustainability and globalisation.

An interesting illustration is the domain of forecasting. The OR literature has tended to emphasise the development of algorithms and techniques for the generation of ‘accurate’ forecasts from historical data sets. The OM literature has been more concerned with forecasting as a process within organisations and its incorporation into effective planning and management regimes. Of course, one cannot live without the other. In large-scale retail organisations, for instance, we rely on timely and accurate model-based forecasts for perhaps many hundreds of product lines across retail stores. But equally we rely on strong operational processes for the translation of such forecasts into appropriate ordering and replenishment decisions, and on the effective management of logistics and distribution processes to ensure on-shelf availability.
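To make the contrast concrete, here is a minimal sketch in Python of the kind of model-based forecast the OR tradition emphasises: simple exponential smoothing over a historical demand series. The demand figures and the smoothing constant alpha are illustrative assumptions, not data from any retailer; the OM concern would then be the surrounding process that turns such a forecast into ordering and replenishment decisions.

    # A minimal sketch: simple exponential smoothing over a demand history.
    # The series and alpha below are illustrative assumptions only.
    def exponential_smoothing_forecast(history, alpha=0.3):
        """Return a one-step-ahead forecast from a list of past demands."""
        level = history[0]  # initialise the smoothed level with the first observation
        for demand in history[1:]:
            # Blend the newest observation with the running level.
            level = alpha * demand + (1 - alpha) * level
        return level  # the forecast for the next period

    # Illustrative weekly demand for one product line at one store.
    weekly_demand = [120, 132, 101, 134, 190, 230, 142]
    print(round(exponential_smoothing_forecast(weekly_demand), 1))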

We agree that this brief article raises more questions than answers. It has not sought to be exhaustive in discussing the foci and boundaries of the OR and OM disciplines. It is little surprise that some have labelled OR the more theoretical and OM the more applied discipline. Is it true? Probably not! Many counter-examples could be given, but there are perhaps elements of truth in these observations.

We conclude by noting that, irrespective of the borders between the two disciplines, they should each seek to address the many challenges that arise in modern complex decision-making environments, and work collaboratively where appropriate. Let’s not argue too much about fuzzy discipline boundaries but instead place the emphasis on relevance and rigour in addressing real-world problems.

Sanja Petrovic and Bart MacCarthy

Nottingham University Business School



Monday, 8 Sep 2014
Cara Quinton

One of the distinguished guest speakers at this year’s Developments in Advanced Analytics and Big Data event, held at the BMA in London on April 30th, was Sir Mark Walport, the Government Chief Scientific Adviser.

The topic of his talk was “The age of analytics: how governments can make the most of the data opportunity”. He began by saying he came from the world of scientific research and funding science, and was aware that questions can generate data and data can generate questions. Equally, he believed there was a case for both scientists and Research Councils asking questions.

He also said the interesting “thing you come across in government” was the stark divide so often found between analysis and policy making, though it seemed to him these were two sides of the same coin: “How can you do analysis if you don’t know what the question is, and equally how can you make policy if you don’t understand something about the analytic methods?” He also said that one of the things he had inherited when he became the Government Chief Scientific Adviser was the “horizon scanning foresight” that had been done for the government for many years, of which the “Blackett reviews” formed an important part.

Blackett had advocated having scientists in close touch with [naval] operational staff to provide them with a quantitative basis for decision making. The success of putting such mechanisms in place during the war not only helped us to win it but also demonstrated the value to government of obtaining scientific advice. He added, “That’s a pretty good job description for what the Government Science Officer hopes to do”.

Sir Mark thought that, in general, data collected by government agencies (such as via medical studies) should be freely available to researchers, provided privacy and confidentiality are taken into account. He then went on to talk about a Foresight Project on what cities will look like in 2065. “Of course”, he said, “everyone talks about Smart cities and I don’t think anyone wants to live in a dumb city. How does one turn that morass of [big data] into useful information for running a city in real time and into planning on a longer time scale? How do we use that information to plan for liveability? Planning tends to be done in a very fragmented way.”

The answer, of course, lay in the application of Operational Research, which could make sense of the fact that cities had by and large developed in unstructured ways. He thought that those of us interested and involved in Operational Research could “perhaps do something better” about planning future cities by “applying the power of analysis” and deriving insight from unstructured data.



Thursday, 10 Jul 2014
Cara Quinton

Since Donald Rumsfeld explained why invading Iraq was not a stupid idea, we have all been familiar with the notion of ‘unknown unknowns’. But unless you are a cutting-edge life scientist, there is really no such thing. Sure, there are lots of things each one of us doesn’t know that we don’t know. But somebody knows them. And that is why networking is such a wonderful thing.

Are you planning to migrate to a software platform you’ve never worked with before? You can ask questions of the distributors – but there may be all sorts of things you don’t know to ask. If you talk to somebody who has used it or, better, has recently migrated to it, you can learn all sorts of things you didn’t know you didn’t know. The same goes for new techniques, new application areas, jobs in new fields… Without talking to people who’ve been there themselves, you are missing out on one of the most vital sources (and in some cases, the only possible source) of information.

Networking as information exchange is not only essential to developing good professional practice, it is also an activity where we can all be givers. Generosity with one’s own knowledge is the mark of a good professional.

Many people prefer to build their networks through serendipitous encounters. They are put off systematic networking by the idea that it leads to ‘using’ people for your own ends, or that it exists to help the sharp-elbowed gain advantage. So it can; but it can be so much more universal and reciprocal than that. What’s more, it can also be fun.

The speed networking session at the ‘Making an Impact’ practitioners’ stream at OR56 is the perfect occasion to see how this works. It is designed so that even the shyest of us can join in without embarrassment, and the outcome is an immediate boost to the number of people you may be able to turn to in the future – or who may be able to turn to you.

What makes the speed networking so exciting is the chance to let people know enough about you in a very limited time and evoke their interest. One of the ways to do this is to have an elevator pitch ready.

Whether you are an analyst trying to pitch your idea, a consultant trying to land another piece of work, a head of department constantly trying to get your budget approved or increased, or an O.R. professional looking for people you could learn from or who could learn from you – what would you say if you met your CEO, a client you have been yearning to work with, or an expert in the very field you are trying to study?

So imagine you are in an elevator, and have 30 to 60 seconds to leave an impression by providing enough information to be invited for the next conversation. Start with a “pain statement” i.e. a problem that you are trying to solve. Next, state what your value proposition is and how what you do solves that problem. Lastly, be clear on what you are looking for. Keep it short. Have a hook. Pitch yourself, not only your ideas. Practise.

Have your perfect elevator pitch ready to gain new insights, expand your professional network, catch up with O.R. colleagues and, last but definitely not least, have fun!

Ruth Kaufman and Ramune Gedgaudaite



Tuesday, 29 Apr 2014
Cara Quinton

Nigel Cummings

“Data scientists are the new rock stars”, according to the likes of Olaf Swantee, CEO of EE; Ken Rudin, head of analytics at Facebook; and Jeff Magnusson, manager of data science platform architecture at Netflix.

These gentlemen believe the data sciences are now so popular that if you ask children what they aspire to be as adults, data scientist will be a choice mentioned in the same breath as fireman, doctor, rock star, rapper, or even astronaut. This is because, these days, data science is seen as a glamorous industry.

Looking at the United States alone for a moment, there are almost 190,000 positions available for up-and-coming data scientists. Companies looking for success in the data sciences are taking in science graduates at all levels: Bachelor’s, Master’s and even doctoral. Stanford, North Carolina State and Northwestern universities are already experiencing a huge influx of students clamouring for degrees in data management and analysis. A similar situation exists in Europe, and applications for data science placements are “on the up” in the UK – you only have to attend one of our careers open days to see that!

A study by the Royal Academy of Engineering shows that British industry will need 1.25 million new STEM graduates between now and 2020 just to maintain current employment numbers. Even that figure might not be enough to satisfy the British data industries’ requirements though.

This is because “big data” is as yet unquantifiable: just how big is it? A study at the end of 2012 by IDC predicted the “digital universe” would reach 40 zettabytes (ZB) by 2020, though in reality that figure could be much higher. (40 ZB is 4 × 10^22 bytes, or approximately 40 × 2^70 bytes.)
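For readers who want the units spelled out, here is a quick check of that conversion, assuming the decimal definition of a zettabyte (1 ZB = 10^21 bytes):

    40\ \text{ZB} = 40 \times 10^{21}\ \text{bytes} = 4 \times 10^{22}\ \text{bytes} \approx 2^{75}\ \text{bytes}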

This terrific surge of data is being created by many external forces, which include financial transactions, mobile phones and social media, the clicks that take place daily on the Internet to access information, and even the updating and keeping of medical records.

According to IDC, only 1% of the world’s data is currently being analysed, and the technology and tools for collecting and storing information have to date raced far ahead of our skills to understand it – data collection is outstripping our ability to develop technologies to analyse it! Filling this “big data gap” is not just a question of “getting up to speed” with technologies, though; it also means importing “new talent” into data-driven employment. These people will be in demand as much as the software developers of the dot-com boom were.

Eric Siegel, author of Predictive Analytics, summed up the value of data when he said: “A user’s data can be purchased for about half a cent, but the average user’s value to the Internet advertising ecosystem is estimated at $1,200 per year.”

Reid Hoffman and Konstantin Guericke created LinkedIn in December 2002 to help build individuals’ networks for them. The “people you may know” feature has, through its ability to target individuals for marketing purposes, raised the “value” of the company to approximately $7.5 billion (£4.5 billion). LinkedIn is one company that shows analytics can be a route to massive revenues!



Wednesday, 19 Mar 2014
Gavin Blackett

The Board of the OR Society is considering the possibility of becoming licensed to award Chartered Scientist status. What is this all about, and is it worth doing?

In 2013 the ORS became a member body of the Science Council, a membership organisation bringing together learned societies and professional bodies across science and its applications. The Science Council aims to promote the advancement and dissemination of knowledge and education in science for public benefit. It currently has around 40 members, including the Institute of Physics, the Royal Statistical Society, and the IMA. 

The Science Council defines science as “the pursuit and application of knowledge and understanding of the natural and social world following a systematic methodology based on evidence” – a definition one feels would be sufficiently broad to encompass O.R.  

Some years ago, the Council developed the Chartered Scientist (CSci) designation, “recognising high levels of professionalism and competence in science”.  Their intention was to benefit the public, by having a badge of competence and professionalism that could be trusted; to benefit the profession, by encouraging networking, promoting continuing professional development (CPD) and benchmarking professional competence levels; to benefit employers similarly; and to benefit individuals by providing a qualification which could be recognised outside the specific discipline or sector, and demonstrate professionalism and commitment.

Although the OR Society is a member body, it is not licensed to award CSci to its members – this would require a considerable investment in both time and money.

Chartership, accreditation...what’s the difference?

When the OR Society brought in its accreditation system many years ago, it was an enormously controversial issue. There was a significant number who saw this as pinning down and ossifying an activity which should be continually developing and growing, adapting as necessary to novel issues and circumstances, and which is in many ways an art or craft as much as a science. No doubt there will be many who will argue that O.R. is not strictly a science and to label it as such might in some way diminish it.

CSci focuses on skills and behaviours: for example, “Exercise self-direction and originality in solving problems, and exercise substantial personal autonomy in planning and implementing tasks at a professional level”. Indeed, although the language is different, the requirements of CSci are very similar to those of accreditation at around the AFORS level.

There are two main differences between the awards:

- CSci has more demanding CPD requirements. Evidence of CPD in the previous two years is necessary as part of the application; and a Chartered Scientist must commit to on-going CPD and be ready to present a record of this when required;

- a CSci must comply with a professional code of conduct.

Is it worth doing?

Accreditation has not proved to be as popular as was originally expected. Obviously there are many reasons for this, and one of them may well be that it does not automatically lead to chartered status. Employers in the UK seem reluctant to insist on professional accreditation, although one suspects few would consider it negatively.

Would CSci be any different? Probably. CSci, along with the other charters, is well recognised internationally. Having members who carry an externally recognised badge may help raise the profile of O.R. professionals in their organisations. As to how many more members this would attract, it is very difficult to say. Again, it is unlikely to be seen in a negative light by those looking to join a professional organisation, but this may not be the case for those who are already members.

It would be a major undertaking for the Society, requiring a substantial investment in both time and money, so it is essential that you tell us what you think. Would you be for or against, and why?

Please send comments to Gavin Blackett, gavin.blackett@theorsociety.com 

For more information about the Science Council go to www.sciencecouncil.org; for more information about Chartered Scientist go to www.charteredscientist.org

 


