Tuesday, August 6, 2019
Harley-Davidson Motorcycle Essay
Introduction

Harley-Davidson Motorcycles have been around for just over 100 years. They became popular after World War II and had continued success until the 1970s, when the company was sold. In 1981 a group of executives bought the company and turned it around into what it is today. Harley-Davidson has had some hard times and some images to shed to get to where it is now, but it has been, and still is, the front runner in the motorcycle industry.

Industry and Competition Analysis

General economic characteristics. The current market for Harley-Davidson motorcycles is mostly baby boomers who want to recapture the freedom of when they were young. Most consumers today are middle to upper class.

Driving forces. The main driving force behind changes made at Harley-Davidson is image. The company has had to continually change to fit, or to change, the image that accompanies its products.

Five Forces model. There is not much rivalry between Harley-Davidson and other manufacturers in the United States. The only company that comes close to competing with them is Honda; however, Honda does not specialize in the same type of motorcycle that Harley-Davidson does. The only substitute product would be a different type of motorcycle such as a street bike, but Harley-Davidson mainly produces touring bikes.

Competitor analysis. There really are not any true competitors in the United States. BMW is competition in Europe, as are other companies that manufacture street or racing bikes, but as far as touring motorcycles go, Harley-Davidson does not have any real competition.

Key success factors. Key success factors include marketing to improve or change image, production to build enough bikes for consumer demand, and distribution, having the right dealerships in place to sell the product.

Attractiveness. This is a very attractive industry for Harley-Davidson; however, it would not be a good industry for a new company to try to enter, due to such strong brand loyalty.

Company Situation Analysis

Harley-Davidson's business strategy to become more profitable is to market a lifestyle instead of a product. The company presents the Harley image as being free and fun. Consumers want to get that feeling, and while most people buy the motorcycles, Harley-Davidson has managed to create a market for its products even among people who do not own a motorcycle. Harley-Davidson's main strength is brand loyalty: when people see a motorcycle, they automatically think of Harley-Davidson. Its weakness, however, is diversification. While the company is extremely successful at producing and selling touring motorcycles, it has not been able to capture the market for other types of motorcycles. Suzuki and Honda are better known for their street bikes than Harley's Buell Company. This leads to its opportunities: Harley-Davidson's biggest opportunity is to develop a street bike that can compete effectively with a Honda or Suzuki motorcycle. Threats to Harley-Davidson include a changing culture. As baby boomers become too old to purchase new motorcycles, the company needs to refocus its attention on the younger generation. In order to capture the younger generation, Harley-Davidson needs to develop a street bike, as not many younger people are as interested in touring motorcycles. Financially, Harley-Davidson is doing very well.
Revenues have grown at a rate of 16% over the past 10 years and have reached $4.6 billion. This is huge growth considering that the company almost went bankrupt in 1985.

Key Issues That Need To Be Addressed

The main issue that Harley-Davidson is facing is the aging of its buyers. As discussed earlier, the new consumers of motorcycles are going to be younger people who are looking more for performance motorcycles than touring ones. Right now Harley is expanding its current business while keeping it the same as it has always been, instead of diversifying the products it manufactures. Harley has such strong brand loyalty, and has finally changed the image of a Harley rider into a positive one, that the company will be okay; there is just not much more room for growth.

Strategy Alternatives

One potential strategy alternative would be to merge with or acquire another company. Harley-Davidson has already tried doing this with Buell motorcycles; however, the Buell brand is not that well known and makes up only 0.8% of the motorcycle market. If Harley-Davidson could acquire a smaller company such as Ducati that is already successful in Europe, it could market it better in the United States, gain even more control over the industry, and continue to increase profits. One weakness of this strategy, however, is image. Harley-Davidson has the all-American image, and buying a foreign company and marketing a foreign product could hurt that image with some Harley owners.

Recommendations

Harley-Davidson is such a well-known and well-respected company that I do not necessarily think it should change its strategy. The company has already tried to incorporate new types of motorcycles into its product line with little success. While the V-Rod is vastly different from the bikes Harley previously made, sales are not as high as those of the traditional motorcycles. Its only real option is to market the V-Rod and Buell motorcycles better, to try to build a new customer base so that it can increase sales if sales really do decrease on its traditional touring motorcycles.
Credit Risk Management in the UK Banking Sector
Contents
- Background
- Literature Review
  - Ascertaining why and how banking credit risk exposure is evolving recently
  - Seeing how banks use credit risk evaluation and assessment tools to mitigate their credit risk exposure
  - The steps and methodologies used by banks to identify, plan, map out, define a framework, develop an analysis and mitigate credit risk
  - Determine the relationship between the theories, concepts and models of credit risk management and what goes on practically in the banking world
  - Ascertain the scope to which resourceful credit risk management can perk up bank performance
  - To evaluate how regulators and government are assisting banks to identify and mitigate credit risk, helping them to adopt risk-based strategies to increase their profitability, and offering assistance on a continuous basis
- Research Methodology
- Analysis (revisiting each of the six objectives above)
- Primary Survey
- Conclusions
- Recommendations
- Bibliography

Background

The sub-prime mortgage meltdown that hit the global banking sector in 2007 was a result of circumstances, actions and repercussions that began years earlier (Long, 2007). The sub-prime mortgage crisis was built on unsound ground from its inception. Sub-prime mortgages represent loans made to borrowers with lower credit ratings than the norm (Investopedia, 2007). Because of the lower borrower credit rating, such borrowers do not qualify for what is termed a conventional mortgage, due to default risk (Investopedia, 2007). Sub-prime mortgages thus carry a higher interest rate to offset the increased risk, which helped to fuel the United States economy through increased home ownership and the attendant spending that accompanies it (Bajaj and Nixon, 2006). Implemented by the Bush administration in the United States to get the economy rolling after the recession that followed the September 11th attacks, the entire plan began to backfire as early as 2004 as a result of the continued building of new housing without the demand to match it (Norris, 2008). The new construction glutted the market, bringing down house prices. This, coupled with a slowing economy in the United States, resulted in layoffs, as well as many sub-prime mortgage holders defaulting on their loans, and the crisis ballooned. Some attribute the over-lending of sub-prime mortgages to predatory lending (Squires, 2004, pp. 81-87), along with the underlying fault of using them as an economic stimulus package that did not control the limits on new housing (Cocheo, 2007).
That set of circumstances represented the cause of the sub-prime mortgage crisis, which spread globally as a result of the tightening of credit due to defaulted loan sell-offs and restricted bank lending ceilings caused by the Basel II Accords (Peterson, 2005). The complexity of the foregoing shall be further explained in the Literature Review section of this study. The preceding summary journey through the sub-prime mortgage crisis was conducted to reveal the manner in which banking credit crunches can and do occur. Its significance to this study is as an example that awakens us to the external factors that can and do cause banking credit crisis situations, thus revealing that despite good management practices such events can manifest themselves. It is also true that poor or lax banking practices can have the same effects. Credit risk management represents the assessing of the risk in pursuing a certain course, or courses, of action (Powell, 2004). In addition to the foregoing U.S.-created sub-prime mortgage crisis, the appearance of new forms of financial instruments has been, and is, causing a problem in credit risk management with regard to the banking sector. As the world's second largest financial centre, the United Kingdom is subject to transaction volumes that increase the risks the banking sector takes, as so many new forms of financial instruments land there first. McClave (1996, p. 15) provides us with an understanding of bank risk that gives us an overview of the problem by telling us: "Banks must manage risk more objectively, using quantitative skills to understand portfolio data and to predict portfolio performance. As a result, risk management will become more process-oriented and less dependent on individuals." Angelopoulos and Mourdoukoutas (2001, p. 11) amplify the preceding in stating that "Banking risk management is both a philosophical and an operational issue." They add: "As a philosophical issue, banking risk management is about attitudes towards risk and the payoff associated with it, and strategies in dealing with them. As an operational issue, risk management is about the identification and classification of banking risks, and methods and procedures to measure, monitor, and control them." (Angelopoulos and Mourdoukoutas, 2001, p. 11) In concluding, Angelopoulos and Mourdoukoutas (2001, p. 11) tell us that the two approaches are in reality not divorced from, or independent of, each other, and that attitudes concerning risk contribute to determining the guidelines for the measurement of risk as well as its control and monitoring. The research that has been conducted has been gathered to address credit risk management in the United Kingdom banking sector. In order to equate such, data has been gathered from all salient sources, regardless of their locale, as basic banking procedures remain constant worldwide. References specific to the European Union and the United Kingdom were employed in those instances when the nuances of legislation, laws, policies and related factors dictated and evidenced a deviance that was specific. In terms of importance, credit risk is one of the most important functions in banking, as it represents the foundation of how banks earn money from the deposited funds they are entrusted with. This being the case, the manner in which banks manage their credit risk is a critical component of their performance over the near term as well as the long term.
The implications are that today's decisions impact the future; thus banks cannot approach current profitability without taking measures to ensure that decisions made in the present do not impact them negatively in the future (Comptroller of the Currency, 2001). A well designed, functioning and managed credit risk rating system promotes the safety of a bank as well as soundness in terms of making informed decisions (Comptroller of the Currency, 2001). The system works by measuring the different types of credit risk through dividing them into groups that differentiate risk by the risk posed. This enables management as well as bank examiners to monitor trends and changes to risk exposure, and thus to minimise risk through diversifying the types of risk taken on through separation (Comptroller of the Currency, 2001). The types of credit risk a bank faces represent a broad array of standard, meaning old and established, sources, as well as new fields that are developing, gaining favour, and or impacting banks as a result of the tightness of international banking that creates a ripple effect. The aforementioned sub-prime crisis had such an effect, in that the closeness of the international banking community accelerated developments. The deregulation of banking has increased the risk stakes for banks, as they are now able to engage in a broad array of lending and investment practices (Dorfman, 1997, pp. 67-73). Banking credit risk has been impacted by technology, which was one of the contributing factors in the sub-prime crisis (Sraeel, 2008). Technology impacts banks on both sides of the coin, in that computing power and new software permit banks to devise and utilise historical risk calculations in equating present risk forms. However, as it is with all formulas, they are only as effective as the parameters entered (Willis, 2003). The interconnected nature of the global banking system means that bank risk has increased as a result of the quick manner in which financial instruments, credit risk transfer, and other systems and forms of risk are handled. The Bank for International Settlements led a committee that looked into payment and settlement systems, which impact all forms of banking credit risk, both new forms as well as long-standing established ones in loans, investments and other fields (TransactionDirectory.com, 2008). The report indicates that while technology and communication systems have increased and are increasing the efficiency of banking through internal management as well as banking systems, these same areas, technology and communications systems, also have contributed and are contributing to risk. The complexity of the issues that arise in a discussion of credit risk management means that there are many terms applicable to the foregoing that are specific to this area of the banking industry. In presenting this material, it was deemed that these special terms would have more impact if they were explained, in terms of their context, as they occur, to ease the task of digesting the information. This study will examine credit risk management in the UK banking sector, and thus will take into account banking regulations, legislation, and the external and internal factors that impact upon this.
Literature Review

The areas to be covered by this study in relation to the topic area, Credit Risk Management in the UK Banking Sector, entail looking at as well as examining it using a number of assessment and analysis points, as represented by the following:

- Ascertaining why and how banking credit risk exposure is evolving recently.
- Seeing how banks use credit risk evaluation and assessment tools to mitigate their credit risk exposure.
- The steps and methodologies used by banks to identify, plan, map out, define a framework, develop an analysis and mitigate credit risk.
- Determine the relationship between the theories, concepts and models of credit risk management and what goes on practically in the banking world.
- Ascertain the scope to which resourceful credit risk management can perk up bank performance.
- To evaluate how regulators and government are assisting banks to identify and mitigate credit risk, helping them to adopt risk-based strategies to increase their profitability, and offering assistance on a continuous basis.

The foregoing also represents the research methodology, which shall be further examined in section 3.0. These aspects have been included here as they represented the focus of the Literature Review, thus dictating the approach. The following review of literature contains segments of the information found on the aforementioned six areas, with the remainder referred to in the Analysis section of this study.

Ascertaining why and how banking credit risk exposure is evolving recently

A report generated by the Bank for International Settlements stated that while transactional costs have been reduced as a result of advanced communication systems, the other side of this development has seen an increase in the potential for disruptions to spread quickly and widely across multiple systems (TransactionDirectory.com, 2008). The report goes on to add that concerns regarding the speed at which transactions occur are not reflected adequately in risk controls, stress tests, crisis management procedures or contingency funding plans (TransactionDirectory.com, 2008). The speed at which transactions happen means that varied forms of risk can move through the banking system in such a manner as to spread broadly before the impact of these transactions is known, as was the case with the debt lay-off of the sub-prime mortgage crisis. One of the critical problems in the sub-prime crisis was that it represented a classic recent example of the ripple effect caused by rapid inter-bank communications and credit risk transfer. When the U.S. housing bubble burst, refinance terms could not cover the dropping house prices, thus leading to defaults. The revaluation of housing prices as a result of overbuilding forced a correction in the U.S. housing market that drove prices in many cases below the assessed mortgage value (Amadeo, 2007). The sub-prime mortgage problem was further exacerbated by mortgage packages such as fixed rate, balloon, adjustable rate, cash-out and other forms that the failure of the U.S. housing market impacted (Demyanyk and Van Hemert, 2007). As defaults increased, banks sold off their positions in bad as well as good loans they deemed to be risks, packaging them as collateralised debt obligations and selling them to differing investor groups (Eckman, 2008).
Some of these collateralised debt obligations, containing sub-prime and other mortgages, were re-bundled and sold again on margin to still another set of investors looking for high returns, sometimes putting down $1 million on a $100 million package and borrowing the rest (Eckman, 2008). When default set in, margin calls began, and the house of cards started caving in. Derivatives represent another risk form that has increased banking exposure. The preceding statement is made because new forms of derivatives are being created all of the time (Culp, 2001, p. 215). Derivatives are not new; they have existed since the 1600s in a rudimentary form as predetermined prices for the future delivery of farming products (Ivkovic, 2008). Ironically, derivatives are utilised in today's financial sector to reduce risk by changing the financial exposure, along with reducing transaction costs (Minehan and Simons, 1995). In summary, some of the uses of derivatives entail taking basic financial instruments, as represented by bonds, loans and stocks, as a few examples, then isolating basic facets such as their agreement to pay, agreements to receive or exchange cash, and other financial considerations, and packaging them as financial instruments (Molvar et al., 1995). While derivatives, in theory, help to spread risk, spreading risk is exactly what caused the sub-prime meltdown, as the risk from U.S. mortgages was bundled and sold, repackaged, and margined, creating a raft of exposure that suffered from the domino effect when the original house of cards came crashing down. Other derivative forms include currency swaps as well as interest rate derivatives that are termed over-the-counter (Cocheo, 1993). The complexity of derivatives has increased to the point where "auditors will need to have special knowledge to be able to evaluate the derivatives' measurement and disclosure so they conform with GAAP. For example, features embedded in contracts or agreements may require separate accounting as a derivative, while complex pricing structures may make assumptions used in estimating the derivative's fair value more complex, too." (Coppinger and Fitzsimons, 2002) The preceding brings attention to the issues in evaluating the risks of derivatives, and to banks having the proper staffing, financial programs and criteria to rate derivative risks on old forms as well as on the consistently new forms being developed. Andrew Crockett, the former General Manager of the Bank for International Settlements, in commenting on derivatives, presented the double-edged sword that these financial instruments represent, and thus the inherent dangers (Whalen, 2004): "When properly used, (derivatives) can be a powerful means of controlling risk that allows firms to economize on scarce capital. However, it is possible for new instruments to be based on models which are poorly designed or understood, or for the instruments to give rise to a high degree of common behaviour in traded markets. The result can be large losses to individual firms or increased market volatility." The foregoing provides background information that relates to understanding why and how banking credit risk exposure has evolved and is evolving. The examples provided have been utilised to illustrate this.

Seeing how banks use credit risk evaluation and assessment tools to mitigate their credit risk exposure

As credit risk is the focal point throughout this study, a definition of the term represents an important aspect.
Credit risk is defined as (Investopedia, 2008): "The risk of loss of principal or loss of a financial reward stemming from a borrower's failure to repay a loan or otherwise meet a contractual obligation. Credit risk arises whenever a borrower is expecting to use future cash flows to pay a current debt. Investors are compensated for assuming credit risk by way of interest payments from the borrower or issuer of a debt obligation." Risk, in terms of investments, is closely aligned with the potential return being offered (Investopedia, 2008). The preceding means that the higher the risk, the higher the rate of return expected by those investing in the risk. Banks utilise a variety of credit risk evaluation and assessment tools to apprise them of credit risk probabilities so that they can mitigate, and or determine, their risk exposure. There are varied forms of credit risk models, which are defined as tools to estimate the probability of credit losses from banking operations in specific as well as overall areas (Lopez and Saidenberg, 2000, pp. 151-165). Lopez and Saidenberg (1999) advise us that the main use of models by banks is to provide forecasts concerning the probability of how losses might occur in the credit portfolio, and the manner in which they might happen. They advise that the aforementioned credit risk model projection of the loss distribution is founded on two factors (Lopez and Saidenberg, 1999): the multivariate distribution, meaning a distribution having more than one variable (Houghton Mifflin, 2008), of the credit losses across all of the credits in the bank's portfolio, and the weighting vector, meaning the direction, characterising these credits. As can be deduced, the ability to measure credit risk is an important factor in improving the risk management capacity of a bank. The importance of the preceding is contained in the Basel II Accord, which states that the capital requirement is three times the projected maximum loss that could occur in terms of a portfolio position (Vassalou and Xing, 2003). Risk models and risk assessment tools form a structural part of the new Basel II Accord, in that banks are required to adhere to three mechanisms for overall operational risk that are set to measure and control liquidity risk, of which credit risk is a big component (Banco de Espana, 2005). The key provisions of the Basel II Accord set forth that (Accenture, 2003):

- capital allocation is risk sensitive;
- operational risk is separated from credit risk;
- capital requirements vary in keeping with the different types of business a bank conducts; and
- the development and use of internal systems is encouraged, to aid the bank in arriving at capital levels that meet requirements.

An explanation of the tools utilised by banks in terms of evaluation as well as assessment will be further explored in the Analysis segment of this study.
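To make the notion of a projected loss distribution more concrete, the short Python sketch below simulates losses on a small, entirely hypothetical loan portfolio. It assumes, for simplicity, independent defaults (real credit risk models, as noted above, work with a multivariate distribution that captures the dependence between credits), and the exposure, default probability and loss-given-default figures are invented for illustration rather than drawn from any of the cited studies.

```python
import numpy as np

# Hypothetical portfolio: exposure at default (EAD), probability of default (PD)
# and loss given default (LGD) for five credits. All figures are invented.
ead = np.array([2_000_000, 500_000, 1_200_000, 750_000, 3_000_000], dtype=float)
pd_ = np.array([0.02, 0.05, 0.01, 0.08, 0.015])
lgd = np.array([0.45, 0.60, 0.40, 0.55, 0.50])

rng = np.random.default_rng(seed=42)
n_scenarios = 100_000

# Simulate defaults independently for each credit in each scenario
# (a simplification: real models capture dependence between obligors).
defaults = rng.random((n_scenarios, ead.size)) < pd_

# Portfolio loss per scenario: sum of EAD * LGD over the defaulted credits.
losses = (defaults * ead * lgd).sum(axis=1)

print(f"Mean simulated portfolio loss: {losses.mean():,.0f}")
print(f"99.9th percentile loss:        {np.percentile(losses, 99.9):,.0f}")
```

The high-percentile figure is the kind of tail statistic against which a capital requirement, such as the Basel II multiple mentioned above, would be judged.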
The steps and methodologies used by banks to identify, plan, map out, define a framework, develop an analysis and mitigate credit risk

The process via which banks identify, plan, map out, define frameworks, develop analyses, and mitigate credit risk covers areas put forth by the Basel II Accord, which shall be described in terms of the oversight measures and the degrees of autonomy banks have in this process. In terms of the word autonomy, it must be explained that the Basel II Accord regulates the standard of banking capital adequacy, setting forth defined measures for the analysis of risk that must meet with regulatory approval (Bank for International Settlements, 2007). This is specified under the three types of capital requirement frameworks that were designed to impact the area of pricing risk so as to make the discipline proactive. The rationale for this tiered process is that it acts as an incentive for banks to seek the top level, which affords them a lowered requirement for capital adequacy as a result of heightened risk management systems and processes across the board (Bank for International Settlements, 2007). The foregoing takes into account liquidity (operational) risk as well as credit risk management and market risk. The risk management foundation of the Basel II Accord separates operational risk from credit risk, with the framework geared to making the risk management process sensitive, along with bringing regulatory and economic capital into closer proximity to reduce arbitrage ranges (Schneider, 2004). The process uses a three-pillar foundation that consists of minimum capital requirements along with supervisory review as well as market discipline to create enhanced stability (Schneider, 2004). The three tiers in the Basel II Accord consist of the following, which are critical in understanding the steps and methodologies utilised by banks to identify, plan, map, define frameworks, analyse and mitigate risk (Bank for International Settlements, 2007):

Standardised Approach. This is the lowest level of capital adequacy calculation, and thus carries the highest reserves. Via this approach risk management is conducted in what is termed a standardised manner, which is founded on credit being externally assessed, with other methods consisting of internal rating measures. In terms of banking activities, they are set forth under eight business categories (Natter, 2004): agency services, corporate finance, trading and sales, asset management, commercial banking, retail banking, retail brokerage, and payment and settlement. The methodology utilised under the standardised approach is based on operational risk that is computed as a percentage of the bank's income derived from each line of business.

Foundation Internal Rating Based Approach (IRB) (Bank for International Settlements, 2007). The Foundation IRB utilises a series of measurements in the calculation of credit risk. Via this method, banks are able to develop empirical models of their own for use in estimating the incidence of default probability for clients. The use of these models must first be reviewed and cleared by local regulators to assure that the models conform to standards, and that they calculate results in a manner in keeping with banking processes in terms of the outcomes and inputs used to arrive at the end figures. Regulators require that the formulas utilised include Loss Given Default (LGD), along with parameters such as the Risk Weighted Asset (RWA). Banks that qualify under this tier are granted a lower capital adequacy holding figure than those under the first tier.

Advanced Internal Rating Based Approach (IRB) (Bank for International Settlements, 2007). Under this last tier, banks are granted the lowest capital adequacy requirements, if they qualify by constructing empirical models that calculate the capital needed to cover credit risk.
The techniques, personnel and equipment needed to meet the foregoing are quite extensive, requiring a substantial investment of time, materials, funds, and personnel; thus this measure generally applies to the largest banks, which have the capability to undertake these tasks. As is the case under the Foundation Internal Rating Based Approach, the models developed must meet with regulator approval. Under the Basel II provisions for this tier, banks are permitted to create quantitative models that calculate the following (Bank for International Settlements, 2007): Exposure at Default (EAD), Risk Weighted Asset (RWA), Probability of Default (PD), and Loss Given Default (LGD). The above facets have been set out to provide an understanding of the operative parameters put into place by Basel II that define the realm in which banks must operate. These tiers also illustrate that the depth of the manner in which banks identify, plan, map out, define frameworks, analyse and mitigate credit risks varies based upon the tier: under the Standardised Approach the formulas are devised by the regulators, while under the IRB tiers banks have the opportunity to devise their own models. Graphically, the preceding looks as follows:

Chart 1: Basel II Three Pillars (Bank for International Settlements, 2007)
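As a minimal illustration of how the IRB risk parameters listed above fit together, the sketch below computes the textbook expected-loss identity EL = PD × LGD × EAD for a single hypothetical exposure. The figures are invented, and the calculation deliberately stops short of the actual Basel II regulatory capital formula, which adds supervisory correlation and maturity adjustments on top of these inputs.

```python
def expected_loss(pd: float, lgd: float, ead: float) -> float:
    """Textbook expected-loss identity: EL = PD * LGD * EAD."""
    return pd * lgd * ead

# Hypothetical exposure: a 10m loan with a 2% one-year default
# probability and 45% loss given default (invented figures).
ead = 10_000_000.0
pd_ = 0.02
lgd = 0.45

el = expected_loss(pd_, lgd, ead)
print(f"Expected loss: {el:,.0f}")  # 90,000 on these assumed inputs
```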
Determine the relationship between the theories, concepts and models of credit risk management and what goes on practically in the banking world

The Basel Committee on Banking Supervision (2000) states that the goal of credit risk management is "to maximise a bank's risk adjusted rate of return by maintaining credit risk exposure within acceptable parameters." The foregoing extends to a bank's entire portfolio, along with the risk represented by individual credits and transactions (Basel Committee on Banking Supervision, 2000). In discussing risk management theories, Pyle (1997) states that it is "the process by which managers satisfy these needs by identifying key risks, obtaining consistent, understandable, operational risk measures, choosing which risks to reduce, and which risks to increase and by what means, and establishing procedures to monitor the resulting risk position." The preceding statement brings forth the complex nature of credit risk management. In understanding the application of risk it is important to note that credit risks are defined as "changes in portfolio value due to the failure of counter parties to meet their obligations, or due to changes in the market's perception of their ability to continue to do so" (Pyle, 1997). In terms of practice, banks have traditionally utilised credit scoring, credit committees, and ratings in assessing credit risk (Pyle, 1997). Bank regulations treat market risk and credit risk as separate categories. J.P. Morgan Securities, Inc. (1997) brought forth the theory that the parallel treatment of market risk and credit risk would strengthen risk management: gauging both facets would aid the accuracy of credit risk assessment by introducing external forces and influences into the equation, revealing events and their correlation with credit risk. Through incorporating the influence and effect of external events from an historical perspective against credit risk default rates, patterns and models result that can serve as useful alerts to pending changes in credit risk, as contained in Pyle's (1997) statement that ended in "due to changes in the market's perception of their ability to continue to do so."

The Plausibility Theory as developed by Wolfgang Spohn represents an approach to making decisions in the face of unknowable risks (Value Based Management, Inc., 2007). Prior to the arrival of the Plausibility Theory, Bayesian statistics was utilised to predict and explain decision making, which was based upon managers making decisions through weighing the likelihood of differing events along with their projected outcomes (Value Based Management, Inc., 2007). Strangely, this theory was not applied to banking. The Risk Threshold of the Plausibility Theory assesses a range of outcomes that may be possible; however, it focuses on the probability of hitting a threshold point, such as net loss relative to acceptable risk (Value Based Management, Inc., 2007). The new Basel II Accord employs a variant of the foregoing that is termed Risk Adjusted Return on Capital, which is a measurement as well as management framework for measuring risk-adjusted financial performance and for providing a consistent view of profitability across business units (divisions) (Value Based Management, Inc., 2007). The foregoing theory of including external events in a calculative model alongside business lines' credit risks is yet to be fully accepted, as combining the variables from external predictive models, to produce scenarios, with credit risk models is a daunting set of equations.
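In its simplest textbook form, the Risk Adjusted Return on Capital measure mentioned above reduces to risk-adjusted net income divided by economic capital. The following sketch uses invented figures purely to show the shape of the calculation; actual RAROC implementations differ considerably in how revenue, expected losses and economic capital are measured.

```python
def raroc(revenue: float, costs: float, expected_loss: float,
          economic_capital: float) -> float:
    """Simplified RAROC: (revenue - costs - expected loss) / economic capital."""
    risk_adjusted_return = revenue - costs - expected_loss
    return risk_adjusted_return / economic_capital

# Invented figures for a single business line.
ratio = raroc(revenue=1_200_000, costs=400_000,
              expected_loss=150_000, economic_capital=5_000_000)
print(f"RAROC: {ratio:.1%}")  # 13.0% on these assumed inputs
```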
Ascertain the scope to which resourceful credit risk management can perk up bank performance

In equating how, and the scope in which, resourceful credit risk management can improve bank performance, one needs to be cognizant that credit risk represents the primary type of financial risk in the banking sector, and that it exists in almost all income-generating areas (Comptroller of the Currency, 2001). From the preceding it follows that a credit risk rating system that is managed and run well will and does promote bank soundness as well as safety, through helping to make and implement decisions that are informed (Comptroller of the Currency, 2001). Through the construction and use of the foregoing, banking management as well as bank examiners and regulators are able to monitor trends as well as changes occurring in risk levels (Comptroller of the Currency, 2001). Through the preceding, management is able to better manage risk, thus optimising returns (Comptroller of the Currency, 2001). When the improvement of credit risk management in terms of identification and monitoring is operated effectively, the process can improve bottom-line performance through laying off risk identified as potentially being problematic in the future (KPMG, 2007). Zimmer (2005) helps us to understand the nuances of transferring credit risk by telling us: "A bank collects funds and originates loans. It might only be able to attract funds if it holds some risk capital that finances losses and saves the bank from insolvency if parts of its loan portfolio default. If the bank faces increasing costs of raising external finance, CRT has a positive effect on the lending capacity of the bank. Providing the bank with additional risk capital, CRT lowers the bank's opportunity cost of additional lending and increases its lending capacity." As has been covered herein, credit risk represents a potential income loss area for banks, in that default subtracts from income, thus lowering a bank's financial performance. The Bank for International Settlements (2003) advises that the principal cause of banking problems is directly related to credit standards that are lax, which is termed poor risk management. The preceding reality has been documented by the Bank for International Settlements (2003), which advises that poor credit risk management procedures and structures rob banks of income, as they fail to identify risks that are in danger of default and thus to take the appropriate actions. A discussion of the means via which resourceful credit risk management enhances bank performance is delved into under the Analysis segment of this study.

To evaluate how regulators and government are assisting banks to identify and mitigate credit risk, helping them to adopt risk-based strategies to increase their profitability, and offering assistance on a continuous basis

In delving into banking credit risk management in the United Kingdom, legislation represents the logical starting place, as it sets the parameters and guidelines under which the banking sector must operate. The Basel II Accord represents the revised international framework for banking capital adequacy.
Monday, August 5, 2019
Analyzing A Selection Of Children's Literature English Literature Essay
With the birth of the field of children's literature over two centuries old, Carnegie Medal winners represent only a small part of the history and tradition of children's literature. The Graveyard Book (2009), the most recent recipient of the award, follows some of the traditions of the field, and differs in others. In my attempt to discuss how The Graveyard Book fits into the history and tradition of children's literature, I will be comparing it with other notable works in the field, specifically Robert Louis Stevenson's Treasure Island (1883), J. K. Rowling's Harry Potter and the Philosopher's Stone (1997), and Philippa Pearce's Tom's Midnight Garden (1958). As a fellow Carnegie winner, Tom's Midnight Garden offers a comparison of fantasy fiction, and when considered with Harry Potter and the Philosopher's Stone provides an interesting view of the changes that have occurred in the genre over the years. Treasure Island is structured similarly to The Graveyard Book, and both novels are good examples of the bildungsroman genre. In the course of this essay I will be referring to a range of critical material relevant to my discussion. The effect of children's literature on children, and the reverse, is circular; as children's attitudes to the world around them change, so too does the literature written for them, and as that literature changes, it again affects children's attitudes. Furthermore, the evolution of adults' understanding of childhood has affected which books are deemed suitable for publication. Children's literature commonly exemplifies the beliefs and context of the culture in which it is written; however, since the majority of children's literature is written by adults, it often reflects issues that concern adults, and not the intended audience. Adult authors must make assumptions about the reaction of a child reader or the behaviour of a child protagonist, and in doing so can sometimes offer a poor representation of a child's perspective. This difference between the adult's and the child's attitude to children's literature can often be seen in the contrast between best-selling books and those books that win literary prizes. Contrary to this, The Graveyard Book won the Newbery Medal, the Hugo Award for Best Novel and the Locus Award for Best Young Adult Novel in 2009, and the 2010 Carnegie Medal (Wikipedia contributors, 2011), spent fifteen weeks on the New York Times best-seller list for children's chapter books (Rich, 2009), and has a film adaptation currently in production (Wikipedia contributors, 2011). Gaiman himself recognized the unusual nature of a book being both popular and prestigious, saying that typically "there are books that are best sellers and books that are winners" (Gaiman quoted in Rich, 2009). The popularity and prestige of a children's book is dependent on a number of different elements: instruction and/or delight, and social, cultural and historical contexts (Maybin, 2009, p. 116). Maybin states that prizes signify a book's prestige in the eyes of the critics, but they are not necessarily an indication of its appeal to children (Maybin, 2009, p. 118). The division between the children's books awarded literary prizes and those that are popular with children is significant.
An example of such division can be seen when comparing Philip Pullman's Northern Lights (1995) and Rowling's Harry Potter and the Philosopher's Stone; Northern Lights was the 1995 Carnegie Medal winner, while Harry Potter and the Philosopher's Stone only reached the shortlist for the 1997 Medal, but went on to win the Nestlé Smarties Book Prize, the British Book Award for Children's Book of the Year and the Children's Book Award, all of which, suggestively, have involved children in the judging process. Like The Graveyard Book, both books are fantasy-adventure novels featuring a young protagonist. All three novels are read and enjoyed by adults and children, but while Northern Lights is considered by adults to be quality literature, Harry Potter is criticised as being "not literature but a phenomenon" (Zipes, 2009, p. 289). Nicholas Tucker (2009) argues that the criteria for judging the quality of children's books varied according to conceptions of childhood; for those with a romantic conception, the emphasis is on an exciting, imaginative storyline, whilst those who view childhood primarily as preparation for adulthood favour books that are "truly representative" (Tucker, 2009, p. 153). If compared to earlier children's books, it appears that modern children's literature reflects the development of a clearer concept of childhood. The debate surrounding instruction and delight in children's literature is one that has occupied scholars for centuries. The first children's book to combine the two concepts was A Little Pretty Pocket-Book (1744), published by John Newbery and featuring the motto "Delectando monemus", instruction with delight (Montgomery, 2009, p. 13). Prior to A Little Pretty Pocket-Book, the majority of children's literature was Puritan in nature, and advocated children's conversion to Christianity in order to save their souls from eternal damnation. The Puritans' concept of original sin resulted in explicitly didactic literature intended to educate children both religiously and morally. Newbery's children's book was, according to Jack Zipes, the first children's book in which amusement rather than religious indoctrination is the central concern (Montgomery, 2009, p. 13). In contrast to the clearly religious books generated by the Puritans, Newbery's books appealed to parents more interested in social and financial improvement; the Letter to Sir declares that learning is "a most excellent thing" and can raise a boy from "a mean State of Life to a Coach and Six" (Montgomery, 2009, p. 14). A Little Pretty Pocket-Book marks the beginning of an evolution in the purpose of children's literature towards a greater concern for the moral development of the child, with an emphasis on becoming a good person for the sake of one's emotional well-being rather than for fear of eternal damnation. The Bildungsroman novel, considered to have begun with the publication of Johann Wolfgang von Goethe's The Apprenticeship of Wilhelm Meister in 1795-6, emphasizes this psychological development. The genre is generally distinguished by a number of topical and thematic elements (Iversen, 2009), and narrates the protagonist's maturation over the course of the novel. The protagonist is usually young and, following early unhappiness, leaves home on a long and demanding journey, along the way maturing into a self-aware, socially responsible young adult. Structurally, a Bildungsroman will often favour inter-character dialogue over extensive plot development, which causes the reader's attention to be centred firmly on the protagonist.
Whilst a Bildungsroman is deemed to be a German novel, many scholars use the term (spelled without a capital) to refer to other novels of a similar style that have been published elsewhere. With this in mind, it can be reasoned that The Graveyard Book follows the traditions of a bildungsroman novel. The Graveyard Book incorporates a number of the elements present in other coming-of-age novels; indeed, Gaiman himself has admitted that the novel was greatly influenced by Kipling's The Jungle Books (1894), which may be considered one of the best known of such novels (Horn, 2010). Gaiman described the idea as "something a lot like The Jungle Book and set it in a graveyard" (Gaiman quoted in Rich, 2009). The similarities between the two books are clear: in the book titles, in the protagonist, even in individual chapters, for example the comparisons between the third chapter of The Graveyard Book, "The Hounds of God", and the second chapter in Book One of The Jungle Books, "Kaa's Hunting". Gaiman's ability to take the premise of a popular book over a hundred years old and develop it into an enjoyable children's book that is both modern and relevant demonstrates how the traditions of children's literature can be transformed to meet the demands of a new audience. A further example of the ongoing tradition of the coming-of-age novel is the Harry Potter series, specifically Harry Potter and the Philosopher's Stone. The protagonists in both the Harry Potter novels and The Graveyard Book are orphaned as babies when their parents/family are killed by a murderer who, after failing to kill them, continues to hunt them until the two meet in a final showdown. This premise features in numerous books for children throughout the history of children's literature, from the already mentioned Jungle Books to Lemony Snicket's A Series of Unfortunate Events (1999-2006). The similarities between Harry Potter and Voldemort and Nobody Owens and the man Jack extend further than the latter's desire to kill; the plots of both novels build from the murder of the protagonist's family, and in both cases these murders are prompted by a prophecy that the protagonist would be the downfall of the antagonist. This concept of the child-hero is a popular one in children's fiction and features throughout the history of children's literature, from Wart in T. H. White's The Sword in the Stone (1938) to Percy Jackson in Rick Riordan's Camp Half-Blood series. Orphaned (whether literally or figuratively) protagonists appear frequently in children's literature, from folk tales to contemporary fiction. A valuable literary device, an orphan provokes sympathy and can generate a perceived alliance between protagonist and reader. An orphaned child protagonist can also be convenient for the author, since without parents the budding child hero has more freedom to experience the sometimes life-threatening adventures that encourage his maturation. This can be seen in Tom's Midnight Garden, the 1958 winner of the Carnegie Medal and one of the Carnegie Medal 70th Anniversary top ten (The CILIP Carnegie & Kate Greenaway Children's Book Awards, 2007). Tom is able to visit the garden partly because of the absence of his parents while he is being cared for by his aunt and uncle; it is clear from the novel that neither adult is accustomed to caring for a child, and Tom takes advantage of this to pursue his nightly visits to the garden.
Whilst Tom can be considered a temporary orphan in a figurative sense, Hatty is literally an orphan, having lost both of her parents at a young age. Their status as orphans is not the only thing that Hatty and Bod share; as Hatty grows up, she ceases to see Tom, in the same way that Bod ceases to see the residents of the graveyard. Alison Waller (2009) argues that in young adult fiction "the ending is always presumed to be a realisation of adulthood and maturity" (Waller, 2009, p. 54). This idea of maturation is reminiscent of Barrie's Peter Pan (1911) and Wendy's realisation that she and her brothers cannot stay in Never Land, but must return home to grow up. Humphrey Carpenter (1985) compares Tom's and Peter's attitudes to their ageing, arguing that the story's conclusion describes Tom's acceptance of what Peter Pan can never accept: "that Time must be allowed to pass, and growth and even old age must be accepted as necessary and even desirable facets of human nature" (Carpenter, 1985). Like his predecessors in the traditions of the coming-of-age novel, Nobody "Bod" Owens is a likeable character, intriguing, and often contradictory in his behaviour: obedient, yet always questioning; determined, yet often managing to find trouble; courageous, yet sensitive. Happy as he is with his adoptive family in the graveyard, at the end of the novel, when he has become a young man, Bod declares that he "want[s] to see life... I want everything" (Gaiman, 2009, p. 286). While this journey of maturation shares a theme with Treasure Island, Bod's declaration is in contrast to Jim's final words which, rather than being optimistic at the possibility of future adventure, are fearfully reminiscent of the "accursed island" (Stevenson, 2008, p. 191). Structurally, The Graveyard Book and Treasure Island share some similarities; both novels centre on the adventures of a single, male protagonist, both can be described as coming-of-age stories, and both have resolved endings. The novels differ in their point of view; where The Graveyard Book is generally narrated in the third person, Treasure Island is narrated in the first person, by Jim Hawkins. However, both novels do deviate from their standard narrative form: there are several parts of The Graveyard Book where the events are recounted by either the man Jack or Scarlett, and in Treasure Island, for chapters 16-18, Stevenson shifts control of the narrative from Jim to Doctor Livesey. In an illustrated talk, Kim Reynolds suggests that children's literature in its current state has been moulded by practices that began in the nineteenth century, and that whilst the content of books today differs significantly from those of the nineteenth century, there were still "the same kinds of divisions then, that we have now in terms of what we might call good literature" (Reynolds, EA300 DVD1, no. 5). A recurring theme in children's literature across the years is the idea of home. Central to the domestic and school stories popular with girls in the nineteenth century, and to the adventure stories popular with boys during the same period, home is either the setting for such novels, as in Little Women, or a place of safety that the protagonist can return to after his adventures, as in Treasure Island. The Graveyard Book departs from this traditional notion of home; what should have been Bod's place of safety became the place where the man Jack murdered his family, so home became a place that does not follow the traditional domestic image.
When he leaves the graveyard as a young man, he realizes that if he does return, "it will be a place, but it won't be home any longer" (Gaiman, 2009, p. 286). Contrary to many earlier children's novels advocating the traditional correlation of home and safety, in The Graveyard Book Bod is in fact safer among the dead in the graveyard, a place that is stereotypically considered scary or even dangerous. In the last two centuries, there has been a significant change in how ghosts are portrayed in children's literature; early literature saw ghosts that were frightening and used to teach children morals, while in contemporary literature they are just as likely to be friendly or even amusing. Both interpretations can be seen in the Harry Potter series, with the Bloody Baron representing the fearsome ghost and Nearly Headless Nick the friendly. The tradition of friendly ghosts in children's literature, such as those in The Graveyard Book, appears to have begun with William Pène du Bois's book Elisabeth the Cow Ghost (1936) (Pearce, 1995). The appearance of ghosts in children's fiction increased during the 1970s and 1980s, with a number of novels that used ghosts to teach their readers about historical events, and others that featured a child protagonist helping a ghost to accept his fate and move on. This is in direct contrast to The Graveyard Book, where it is Bod who has to move on into the world of the living, while the ghosts are left in the graveyard. The publication of The Graveyard Book follows a recent rise in the popularity of paranormal fiction amongst children and young adults. Fantasy fiction as it is today has been developing since the revival of folk and fairy tales in the early 1800s, advancing particularly during the First Golden Age of children's literature. Modern fantasy tends to reject traditional sentimentality, exploring instead complex moral and sociological issues. In a similar way to modern realism, modern fantasy fiction has broached a number of taboo subjects, the most significant in The Graveyard Book being death. In the early history of children's literature, when death occurred in a book, it was often as a punishment, used to illustrate where the wrong path could lead. In contrast, in The Graveyard Book death is treated as a natural part of life, not to be either welcomed or feared. However, unlike other children's fiction that handles the subject, death in The Graveyard Book is largely regarded light-heartedly, unlike, for example, in The Other Side of Truth, where their mother's death acts as the catalyst for Sade and Femi's subsequent ordeals. The acceptance of subjects that have previously been considered taboo is, according to Rachel Falconer (2009), a result of "changing conditions of contemporary childhood" (Falconer, 2009, p. 373). The Graveyard Book encapsulates some of the major traditions of children's literature and is reminiscent of some of the most noteworthy works in the history of the field. At the same time, the novel pushes the boundaries of what is accepted, unmasking a taboo subject and treating it positively but tastefully. A best-seller, the novel continues the current trend of paranormal fiction, and bridges the gap between the popular and the prestigious by winning numerous literary awards.
Neil Gaiman's description of his book as "a book about life and childhood and the value of childhood" (Gaiman quoted in Horn, 2010) places it firmly amongst the field's traditions, and the book's double win of the Newbery Medal and the Carnegie Medal gives it a significant role in the continuing development of the field of children's literature.
Sunday, August 4, 2019
Subdivisions of Corneal Ulcers and Treatment
The eye is one of the vital organs in a human being. As seen in figure 1, the eye is composed of many different parts and functions. The cornea is a clear covering over the colored iris and the pupil of the eye. The function of the cornea is to help focus light on the retina and to protect the iris, lens, and other structures so that the eye can see. The cornea is best compared to a standard contact lens. Although the function of the cornea is to protect the eye from harmful microorganisms, it is also vulnerable to those same unicellular organisms. One of the major diseases affecting the cornea is a corneal ulcer. A corneal ulcer is a "non-penetrating erosion, or open sore in the outer layer of the cornea, the transparent area at the front of the eyeball" (MedlinePlus). A corneal ulcer has many different names, depending on the microorganism that causes it. Some of the major diseases include bacterial keratitis, fungal keratitis, Acanthamoeba keratitis, and herpes simplex keratitis. Bacteria, fungi, amoebae, and viruses are the prime causes of these diseases. These microorganisms settle in the cornea, grow, and feed on it; this process causes a corneal ulceration. Contact lenses are the leading way these microorganisms enter the cornea (discussed later). There are multiple symptoms that help identify corneal ulceration, including the following: eye redness, increased tearing, impaired vision, burning, itching, and photophobia (sensitivity to light) (MedlinePlus).

Many different methods of detecting a corneal ulcer are available at the doctor's office. The visual acuity test, the slit-lamp examination, and Schirmer's (tear) test are some of the tests that a doctor conducts during an eye examination. The visual acuity test allows the doctor to measure a person's vision by having them read the eye chart (figure 6). A slit lamp is a specialized magnifying microscope with which a doctor can examine the cornea, iris, and retina; it is used to look at the interior of the eye with its built-in light source and a camera (figure 7). Schirmer's test determines whether or not there are enough tears to keep the eye moist. Other methods of detecting a corneal ulcer are keratometry (measurement of the cornea) and scraping of the ulcer for analysis (MedlinePlus).

There are many different ways to treat a corneal ulcer. Many times, a corneal ulcer is treated in the doctor's office using eye drops.
Saturday, August 3, 2019
The Importance Of The Human Genome Project Essay -- Science Genetics B
The Importance Of The Human Genome Project

"This is the outstanding achievement not only of our lifetime, but of human history. I say this because the Human Genome Project has the potential to impact the life of every person on this planet. It is a giant resource that will change mankind, much like the printing press did." The famous words of Dr. James Watson resonated as a victory bell, signaling the successful completion of what many deemed the boldest undertaking in the history of biology: the Human Genome Project (2003). On the fiftieth anniversary of the day that forever changed science, the day Watson and his colleague Francis Crick unraveled the secret of life in the structure of deoxyribonucleic acid, the world was presented with another shocking discovery: the complete sequence of the human genome. Almost immediately, uproar swept throughout the science community and the world at large, as many believed that the solution to our problems had finally arrived: the true secret of life, the panacea that would dissipate the ominous clouds of disease and suffering. Yet, as often happens when a promising new idea is presented on tenuous grounds, the revelers had only heard a fraction of the entire story; their grand hopes were born primarily of imagination. But when all the celebratory confetti had cleared, there stood defiantly, amidst all the hoopla, voices of reason. Molecular anthropologist Jonathan Marks's voice was one of these. In an excerpt from his work What It Means to Be 98% Chimpanzee: Apes, People, and Their Genes, Marks challenges the importance of the Human Genome Project and our genes, advocating instead a more rational and moderate view of them. By exposing three of the Project's flaws, he hopes to convince... ...ealize that our genes are but one aspect of our history, that there are many other histories that are even more important; "it is a delusion to think that genomics in isolation will ever tell us what it means to be human" (2001, paragraph 11). Indeed, everything is not solely in our genes.

Works Cited

Beckwith, J. (2002). Geneticists in society, society in genetics. In J. Alper (Ed.), The double-edged helix (pp. 39-57). Baltimore: The Johns Hopkins University Press.

Lewontin, R. C. (1991). Causes and their effects. In Biology as ideology: The doctrine of DNA (pp. 41-57). New York: HarperPerennial.

Marks, J. (2002). The meaning of human variation. In What it means to be 98% chimpanzee: Apes, people, and their genes (pp. 88-95). Berkeley: University of California Press.

Paabo, S. (2001). The human genome and our view of ourselves. Science, 291, 1219-1220.
Friday, August 2, 2019
Essentials of Business Management Essays -- GCSE Business Marketing Co
Essentials of Business Management

When Sam Walton opened the first Wal-Mart store in 1962, it was the beginning of an American success story that no one could have predicted. A small-town merchant who had operated variety stores in Arkansas and Missouri, Walton was convinced that consumers would flock to a discount store with a wide array of merchandise and friendly service. Hence, Wal-Mart's mission is to deliver big-city discounting to small-town America.

Sam's Roots

From humble, hard-working roots, Sam Walton built Wal-Mart Stores, Inc. into the largest, fastest-growing, and most profitable retailer in the world. A child of the Depression, Sam always worked hard. He would milk the cows, and by the age of eight he had started selling magazine subscriptions. When he turned 12, Sam took on a paper route that he continued well into his college days to support himself. Walton began his retail career at J.C. Penney in Des Moines, Iowa in 1940, making just $75 per month. In 1945, Sam borrowed $5,000 from his wife and $20,000 from his wife's family to open a Ben Franklin five-and-dime franchise in Newport, Arkansas. In 1950, he relocated to Bentonville, Arkansas and opened a Walton's 5&10. Over the next 12 years they built up and grew to 15 Ben Franklin stores under the Walton's 5&10 name. Sam had plenty of new ideas. He liked to deal with suppliers directly so he could pass the savings on to the customers. He later brought a new idea to Ben Franklin management: that they should open discount stores in small towns. They rejected his idea.

The First of 3054

Sam and his brother James (Bud) opened their first Wal-Mart Discount City store in Rogers, Arkansas in 1962. Walton and his wife Helen had to put up everything they had, including their house and property, to finance the first 18,000-square-foot store. With gradual growth over the next eight years, the company went public in 1970 with only 18 stores and sales of $44 million. While other large chains lagged behind, Wal-Mart soon grew rapidly in the 1970's, due to its highly automated distribution centers and computerization. By 1980, the chain was up to 276 stores with revenues of over $1.2 billion. Sam Walton's guiding philosophy for his stores from the beginning was to offer consumers a wide selection of goods at a discounted price. The company saved money by keeping advertising costs low... ...equests for no publicity.

The Ten Commandments of Leadership by Sam Walton

1. Commit to your goals.
2. Share your rewards.
3. Energize your colleagues.
4. Communicate all you know.
5. Value your associates' contributions.
6. Celebrate your success.
7. Listen to everyone.
8. Deliver more than you promise.
9. Work smarter than others do.
10. Blaze your own path.
Thursday, August 1, 2019
Musical Genre Classification of Audio Signals Essay
Musical genres are categorized by humans and depend on human hearing. There are common characteristics shared by the categories, related to the instrumentation, rhythmic structure, and harmonic content of the music. Currently, much music is still classified manually. An automated system for musical genre classification can assist or replace this manual work. In this paper, the automatic classification of audio signals into a hierarchy of musical genres is explored. Three feature sets representing timbral texture, rhythmic content, and pitch content are proposed. Classification through a two-time KNN classification method is also proposed, and the resulting gain in accuracy is shown: the two-time KNN method increases accuracy by about 5% over one-time KNN classification, with the two-time KNN accuracy at 77.9% and the one-time KNN accuracy at 73.3%.

Index Terms - Music classification, feature extraction, wavelets, KNN classification

Table of Contents
I. Introduction
II. Music Modeling & Genre Segmentation
III. Feature Extraction
  A. Timbral Texture Features
    i. Spectral shape features
    ii. Mel-frequency cepstral coefficients (MFCCs)
    iii. Texture window
    iv. Low-Energy features
  B. Rhythmic Features
  C. Pitch Content Features
IV. Classification
V. Evaluation and Discussion
VI. References

I. Introduction

Musical genres are categorized by humans and depend on human hearing. There are common characteristics shared by the categories, related to the instrumentation, rhythmic structure, and harmonic content of the music. The importance of genre classification grew when the music industry moved from CD to the web, where music is distributed in large quantities. Currently, much music is still classified manually, and an automated system for musical genre classification can assist or replace this manual work.

The era of the web has made it possible to access large amounts of all kinds of data, such as music, movies, and news. Music databases have grown exponentially since the first perceptual coders appeared in the early 90's. As the databases grow, they demand tools that can search, retrieve, and handle large amounts of data. Classifying musical genre is a great tool for searching, retrieving, and handling a large music database [1-3]. There are several related methods, such as music emotion classification [4], beat tracking [5], and preference recommendation [6].

Musical genre classification (MGC) is created and used to categorize and describe music. Musical genre has no precise definitions or boundaries because it is categorized by human hearing, and it is highly related to marketing, historical, and cultural factors. Different countries and organizations have different genre lists, and they even define the same genre differently, so it is hard to define certain genres precisely. There is no official specification of music genre to date; there are about 500 to 800 genres in music [7, 8]. Some researchers have suggested definitions of musical genre classification [9]. After several attempts to define musical genres, researchers found that genres share certain characteristics such as instrumentation, rhythmic structure, and pitch content. Genre hierarchies were created by human experts and are currently used to classify music on the web.
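The excerpt does not spell out the exact procedure behind the two-time KNN classification reported in the abstract above. One plausible reading, given the genre hierarchy of Figure 2, is a hierarchical scheme: a first KNN classifier assigns a song's feature vector to a coarse genre group, and a second KNN classifier, trained only on songs from that group, assigns the final genre. The Python sketch below illustrates that reading; the use of scikit-learn, the random feature vectors, and the genre labels are all assumptions made for illustration and are not taken from the paper.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Hypothetical training data: one 30-dimensional feature vector per song
# (e.g. timbral, rhythmic, and pitch features) plus a fine genre label.
rng = np.random.default_rng(0)
X_train = rng.random((200, 30))
genre_train = rng.choice(["pop", "rock", "classical", "jazz"], size=200)

# Coarse genre groups for the first classification stage (illustrative only).
group_of = {"pop": "pop_rock", "rock": "pop_rock",
            "classical": "classical_jazz", "jazz": "classical_jazz"}
group_train = np.array([group_of[g] for g in genre_train])

# Stage 1: KNN over all songs, predicting the coarse group.
stage1 = KNeighborsClassifier(n_neighbors=5).fit(X_train, group_train)

# Stage 2: one KNN per group, trained only on that group's songs.
stage2 = {}
for g in np.unique(group_train):
    mask = group_train == g
    stage2[g] = KNeighborsClassifier(n_neighbors=5).fit(X_train[mask], genre_train[mask])

def classify(features):
    """Two-stage ("two-time") KNN: coarse group first, then fine genre."""
    x = features.reshape(1, -1)
    group = stage1.predict(x)[0]
    return stage2[group].predict(x)[0]

print(classify(rng.random(30)))
```

Restricting the second classifier to songs of the predicted group is one way a second KNN pass could raise accuracy, since confusable genres from other groups are excluded; whether this matches the method evaluated in Section V cannot be confirmed from the excerpt.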
Automatic MGC can automate the classification process and provide an important component of complete music information systems. The most significant proposal to specifically deal with this task was released in 2002 [3], and several strategies dealing with related problems have been proposed. In this paper, the automatic musical genre classification system shown in Figure 1 is proposed. For feature extraction, three sets of features representing instrumentation (timbral texture), rhythmic content, and pitch content are proposed.

Figure 1: Automatic Musical Genre Classification

II. Music Modeling & Genre Segmentation

An untrained, non-expert person can detect the genre of a song with an accuracy of 72% after hearing a three-second segment of the song [11]. However, a computer is not designed like the human brain, so it cannot process MGC the way a human does. Although the whole song may somehow influence the representativeness of a feature, using the whole song allows most of the features the music has to be extracted. Extracting only a short segment of the music is unsuited to an automated system because of the difficulty of finding the exact portion of the music that represents its genre. Without research identifying which section of a song best represents its character, using the whole song for modeling is the proper approach to MGC. There are too many music genres in use on the web [7, 8], so the set of genres to classify has to be simplified; this paper proposes the genres that are popular in MP3 players on the market.

Figure 2: Taxonomy of Music Genre

III. Feature Extraction

Feature extraction is the process of computing a numerical representation that can be used to characterize a segment of audio and classify its genre. A digital music file contains data sampled from an analog audio signal and has a huge data size compared to its actual information content. Features are thus extracted from the audio signal to obtain more meaningful information and to reduce the processing load. For feature extraction, three sets of features representing instrumentation (timbral texture), rhythmic content, and pitch content will be used [3].

A. Timbral Texture Features

The features used to represent timbral texture are based on features proposed in speech recognition. The following specific features are usually used to represent timbral texture.

Spectral shape features [1-3]: Spectral shape features are computed directly from the power spectrum of an audio signal frame, describing the shape and characteristics of the power spectrum. The calculated features are based on the short-time Fourier transform (STFT) and are calculated for every short-time frame of sound. There are several spectral shape features that can be extracted.

1. Spectral centroid: the centroid of the magnitude spectrum of the STFT, a measure of spectral brightness.
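As a concrete illustration of a spectral shape feature, the sketch below frames a mono signal, applies a Hann window, takes the STFT of each frame, and computes the spectral centroid as the magnitude-weighted mean frequency of the frame. It is a minimal example written for this summary; the frame and hop sizes and the function name are illustrative choices, not values from the paper.

```python
import numpy as np

def spectral_centroid(signal, sample_rate, frame_size=2048, hop_size=1024):
    """Spectral centroid (in Hz) of each short-time frame of a mono signal.

    centroid = sum(f * |X(f)|) / sum(|X(f)|), computed over the magnitude
    spectrum of each windowed frame; higher values indicate a "brighter" sound.
    """
    window = np.hanning(frame_size)
    freqs = np.fft.rfftfreq(frame_size, d=1.0 / sample_rate)  # bin frequencies in Hz
    centroids = []
    for start in range(0, len(signal) - frame_size + 1, hop_size):
        frame = signal[start:start + frame_size] * window
        magnitude = np.abs(np.fft.rfft(frame))
        total = magnitude.sum()
        centroids.append((freqs * magnitude).sum() / total if total > 0 else 0.0)
    return np.array(centroids)

# Quick check: the centroid of a pure 440 Hz tone should sit near 440 Hz.
sr = 22050
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 440.0 * t)
print(spectral_centroid(tone, sr).mean())
```

The same framing loop could, under the same assumptions, be extended to the other timbral features listed in the table of contents, such as spectral rolloff, spectral flux, or MFCCs.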