Walk down the High Street and look at the labels inside the clothes on sale, and you will see China, Bangladesh and Turkey prominent among others. Between them these three countries supply about 50 per cent of the clothing on the UK High Street, if not more. Prices have been low for the past decade, and many retail brands have made substantial profits by sourcing product from these countries. Wage costs have risen in China, putting pressure on the prices charged to European and US retailers. China itself has sought to lower its cost base by managing complex supply chains with countries such as Myanmar, Cambodia and Vietnam, where labour costs are lower. Bangladesh too has pushed up wage rates, although they remain very low. Turkey already pays higher wages than either of those two countries, but its quality and its proximity to Europe, with faster supply times, keep it competitive.
However, it is now not just labour costs but worsening exchange rates on imports that will raise costs and, of course, push up the prices you pay at the checkout, adding to inflationary pressure. In future, as Brexit becomes a reality, new tariffs may kick in too. The inflation rate measured by the Consumer Price Index (CPI) stood at 0.6 per cent in the year to August 2016, while the inflation rate for clothing was negative, at −1.2 per cent. Is this about to change as well, as China, Bangladesh and Turkey seek higher prices for their goods and the pound falls against world currencies?
Figure 1 Consumer Price Index (CPI)
Source: CSO, 2016
The pound has fallen against the Chinese yuan to 8.21 from a high of 10 just over a year ago. Against the Bangladesh taka it has fallen from 120 to the pound to 95.5 in the year to September 2016, and against the Turkish lira from a high of 4.70 in September 2015 to 3.78 in September 2016. These shifts alone will push up the UK cost of clothing from these supplier countries.
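To see the scale of the effect, the exchange-rate moves quoted above can be turned into the implied sterling cost increase for a garment priced in the supplier's currency. This is a simple sketch using only the rates stated in the article; it ignores hedging, contract terms and any supplier price changes.

```python
# Sterling cost impact of the quoted exchange-rate moves.
# Rates are foreign-currency units per GBP, as given in the article.
rates = {
    "Chinese yuan": (10.0, 8.21),
    "Bangladesh taka": (120.0, 95.5),
    "Turkish lira": (4.70, 3.78),
}

for currency, (old_rate, new_rate) in rates.items():
    # A fixed local-currency price costs more pounds when each pound
    # buys fewer units of the supplier's currency.
    increase = (old_rate / new_rate - 1) * 100
    print(f"{currency}: GBP cost up {increase:.1f}%")
```

On these figures, a garment invoiced in yuan costs roughly 22 per cent more in sterling than a year earlier, with the taka and lira moves implying increases of around 26 and 24 per cent respectively.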
The principles of doing good research apply whatever the medium. Develop clear research objectives to achieve your aim, assemble appropriate research methods, and weigh up cost, time and quality in your research design. It is claimed that the use of social media has made research easier to do, and in many respects this is so: research can be done faster and more cheaply. Nevertheless, you cannot simply use it for everything. Think about why you want to use it and be aware that it will not answer every type of research question. The old maxim of having good data is still relevant, meaning that you want access to appropriate data of the right quality for the question(s) you want to answer. For many, the speed and relatively low cost are a trade-off against the quality of research data, and this is often considered a price worth paying. In the age of ‘Big Data’ it is assumed by many that all you need to do is find the right big data set to address your question. If you think about this a little, however, you will see why the assumption is flawed: your research question might demand a different approach. For example, if you want to understand the reasons for something happening the way it does, a large data set suited to answering ‘what’ questions rather than ‘why’ questions may not be appropriate.
Planning is the key to doing good research. The first step is to establish your research aim and the specific objectives to be achieved. This should give you some idea of the nature of the data you will need to access. Clues may also come from others who have completed similar types of study: research reports indicate what works and what has been less successful, so read them. If you have a good idea, establish its viability before you rush to implement it. Why has no one else done this? Is it feasible and sensible, or are there potential problems with the approach you are proposing?
Once you have decided what data sources you will use, the next step is to consider what you will do with the data once it is collected. What type of analysis will you need to do? Will it be quantitative or qualitative? Are there particular techniques or tools that you can employ? Do you have the necessary skills and training to do the analysis? Will you need help? These are options to think about carefully before you progress too far.
Social media data tends to be unstructured, and you need to think about how your research study can provide structure to make sense of it. Combining different types of data may give your study a better understanding of the research question. For example, you may gain better insights if you combine survey data, used to establish measures or hypotheses, with transaction data or observations, and with qualitative methods such as interviews (individual or focus groups) to understand the opinions of your participants. Opinion is something social media can supply in abundance. The caveat, of course, is whether you have accurately identified the target audience from which you want to glean data. If you do not do this carefully, the research will be flawed. For example, if you wanted to know the potential to increase your market share for bicycles and you simply targeted existing owners of bicycles, the research would be limited. You should really be asking different groups of people who might be persuaded to buy a bicycle in future, such as those using other means of transport or doing other forms of exercise.
I began by stating that the principles of research design apply to social media as much as to any other data source. I trust this article has given you reasons for that statement. You can read more about how market research is moving forward in this area by accessing the article below.
Fracking is a hydraulic process of extracting energy, particularly gas and oil, from geological stores underground. The arguments for it suggest that it will add to cheaper sources of energy, is relatively efficient and, in most cases, is sited close to where the energy is consumed. Short supply chains and a quick fix for energy shortages are driving interest. The downside is that it requires large quantities of water and chemicals to replace the extracted gas. It is claimed that the process has produced earthquakes, and concerns over this aspect remain. The technique has been developed since 1947 and continues to develop, but whether it will provide enough energy to keep the lights on is debatable. Many people believe that a return to older sources of extraction and energy generation will be necessary to do that; Germany has invested in coal-fired power to keep its lights on. So who’s right?
Most energy produced in developed and developing economies alike still relies mainly on coal; oil is only number two. So what is the problem with fracking?
Fracking, it has been suggested, risks tremors and quakes. More importantly, however, it is expensive, not ecologically friendly and non-renewable, and it warms the planet rather than slowing the warming down. Proponents of fracking suggest that burning gas emits roughly half the CO2 that coal would, but research suggests this is not so and that the benefits are illusory. Pumping water into shale to release the gas trapped in the earth releases methane into the atmosphere, and this is the problem. Burning coal, although it releases sulphur dioxide and black carbon, actually cools the planet, offsetting the warming effect of the gases it generates by 40 per cent. Fracking would increase warming immediately and would take about 100 years to become equivalent to coal.
So is fracking the way to go? The science might suggest not. The potential environmental cost is high, while as a solution to medium- and longer-term energy demand it is likely to make only a small contribution to total energy requirements.
One of the biggest challenges facing all governments is energy. As energy exploration increasingly moves to difficult geographical terrain and is moved greater distances in larger volumes there are difficult challenges ahead. Do they have the technology? What is the real ‘cost’ of energy? How are supply chains managed? Are they sustainable? Do consumers care? And what are the environmental challenges?
New methods of exploration and extraction are necessary when oil lies beneath ice, and large oil companies increasingly face these challenges daily in the Arctic. The technology builds on that developed for North Sea oil, with drilling platforms on ice. The estimated cost of the BP ‘Deepwater Horizon’ disaster in the Gulf of Mexico in 2010 is $50 billion. When offshore oil wells blow out they can spew 50,000–60,000 barrels a day, all of it waste that needs cleaning up, while the governments responsible for the territory fight it out with the oil companies over compensation. Conditions in the Arctic are far more hostile than in the Gulf of Mexico. When a spill occurs it is usually detected by satellite-mounted synthetic aperture radar (SAR), a robust technology that works by bouncing radio waves from an orbiting satellite off the sea. In the Arctic, SAR is of less use because floating ice behaves just like an oil slick and the radar cannot discriminate between them; the current technology is ineffective where there is more than 30 per cent ice cover. In these conditions, infrared or ultraviolet scanners are required because they can discriminate, but they have to be carried on ships or planes. Robot submarines, known as autonomous underwater vehicles (AUVs), are also pioneering detection.
As energy becomes more expensive, growing demand from global industry makes hostile environments more attractive propositions. The risk of disrupting supplies increases as the distances over which energy is transported, from its original location to where it is consumed, become greater. These disruptions have increasingly included oil spills and the damage they cause to environments. Do consumers care about cost, and which cost exactly do they care about? Usually the economic cost, and seldom the social or environmental costs unless forced to through taxes. Taxation is a means of regulating industry, but it is not easy: boundaries and responsibilities are blurred, enforcement is weak in neoliberal economies committed to free markets, and penalties are often watered down when it comes to oil. The politics of oil and energy is tricky.
Back in 1965 Gordon Moore, who later co-founded Intel, observed that the number of transistors on integrated circuits had doubled every year since the integrated circuit was invented, making miniaturization a possibility. The industry mantra since has been to make things smarter, faster, cheaper and smaller. Is all that about to end? Press reports suggest that miniaturization is at a crossroads. Photographic processes have until now been able to lay patterns on silicon, increasing the number of transistors on a chip, but chips are becoming congested with the current method of achieving the patterns needed to get more onto less. There is a new way to do it using Extreme Ultraviolet (EUV) lithography, but EUV requires greater investment ($100 million), with the technology costing twice as much as the current machines used to produce chips photographically. This is a gloomy prognosis for continuing miniaturization from the industry.
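Moore's observation is, at heart, simple exponential growth, and a short sketch makes the stakes of the doubling cadence concrete. The starting count (roughly that of Intel's 4004 chip of 1971) and the ten-year horizon below are illustrative assumptions for scale, not industry forecasts.

```python
def transistors(start, years, doubling_period):
    """Transistor count after `years`, doubling every `doubling_period` years."""
    return start * 2 ** (years / doubling_period)

start = 2_300  # roughly the Intel 4004 (1971), used here only for scale

# Moore's original 1965 cadence: doubling every year.
print(transistors(start, 10, 1))   # ~2.36 million after a decade

# The later, slower two-year cadence often quoted for the industry.
print(transistors(start, 10, 2))   # ~73,600 after a decade
```

The gap between the two cadences, a factor of 32 over a single decade, is why even a modest slowdown in the doubling rate matters so much to the economics of chip making.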
However, is it all as gloomy as the industry predicts? In the past few years smartphones have actually increased in size, and other devices such as tablets, which are larger still, have the potential to pack more into their 7 and 10.1 inch frames. Perhaps the industry has more time than it forecasts to innovate for a new era of mass manufacturing using EUV. Less is more through innovation, and so perhaps Moore’s law prevails in the tablet market?