ERIC EJ1077819: An Engineering Educator's Decision Support Tool for Improving Innovation in Student Design Projects

Advances in Engineering Education, SUMMER 2015

An Engineering Educator's Decision Support Tool for Improving Innovation in Student Design Projects

NUR OZGE OZALTIN, MARY BESTERFIELD-SACRE, AND RENEE M. CLARK
University of Pittsburgh, Pittsburgh, PA

ABSTRACT

Learning how to design innovatively is a critical process skill for undergraduate engineers in the 21st century. To this end, our paper discusses the development and validation of a Bayesian network decision support tool that can be used by engineering educators to make recommendations that positively impact the innovativeness of product designs. Our Bayesian network model is based on Dym's design process framework and actual design process data collected from 26 undergraduate engineering capstone teams over multiple terms. Cross validation using all available outcomes data and a sensitivity analysis showed our model to be both accurate and robust. Our model, which is based on data from teams that produced both innovative and non-innovative products, can be used to formatively assess the process used by a design team and the level of innovativeness, thereby contributing to more innovative final design outcomes.

Key Words: decision support tool, Bayesian network model, engineering design education

INTRODUCTION

To capture and retain market share in the modern business environment, today's organizations must meet or exceed customer expectations through product innovation (Shen et al., 2000). Innovation includes the introduction of new materials, components, and/or manufacturing processes; design changes employed to reduce manufacturing costs; and new applications of existing technology (Schumpeter, 1934).
Solid knowledge of innovative design is critical for successful, professional contribution within the workforce; therefore, participation in student design projects such as senior capstone experiences is an integral part of the undergraduate engineering curriculum. These activities are invaluable learning experiences and equip future engineering professionals with the ability to design new products and services with innovative features in rapidly changing and highly competitive markets.

Thus, instructors who teach innovative design techniques would benefit from a decision support tool for guiding teams to carry out those activities and processes that best contribute to the production of innovative products. To this end, our paper describes the development of such a tool that allows engineering design instructors to 1) predict the innovativeness of a design artifact given the team's activities as well as 2) provide guidance on activities that would contribute to a more innovative design outcome. This tool, implemented as a Bayesian network model, was developed based on empirical process-level activity data collected from undergraduate teams that produced both innovative and non-innovative capstone design products. Specifically, the activities of 26 senior bioengineering design teams were journaled using a secured, online survey system over a 23 to 24 week period as they progressed from an initial design concept to a working prototype. Subsequently, using this process-level activity data and Dym's design process framework, three separate Bayesian network models for the early, middle, and late phases of the design process were built using the GeNie decision modeling software.
Bayesian networks allow for determining the probability that an upstream event or activity occurred given that a downstream event, such as an innovative outcome on a capstone design project, also occurred (Genie Documentation, 2014). Thus, with a Bayesian network and ultimately Bayes Theorem, on which this network is based, we can determine the probability that a certain design process activity was performed to a certain degree in a certain project phase given that the design was ultimately rated as "innovative" or "non-innovative." This type of analysis can lead to data-driven recommendations from instructors to students. Our work began as a doctoral dissertation, with subsequent refinement and analysis by the additional authors (Ozaltin, 2012).

LITERATURE REVIEW AND DESIGN FRAMEWORK

Engineering Design and Design Frameworks

Design is a central and distinguishing engineering activity (Simon, 1996). It is a complex process versus a single isolated action that has a collectivist nature (Okudan & Mohammed, 2006). There is also no universally agreed-upon definition of design (Hyman, 2003). The literature has many simplified step-by-step models and frameworks of the design process based on the ABET definition, which presents the aspects of an ideal engineering design process (ABET Criteria, 2013). Hyman breaks the design process into the following nine steps: recognizing the need, defining the problem, planning the project, gathering information, conceptualizing alternative approaches, evaluating the alternatives, selecting the preferred alternative, communicating the design, and implementing the preferred design (Hyman, 2003). Another framework proposed by Pugh is based on a design core including the market, specification, concept design, detailed design, manufacture, and marketing (Pugh, 1990). Atman et al.
compare freshman and senior engineering design processes and describe the design steps as follows: identification of the need, problem definition, gathering of information, generation of ideas, modeling, feasibility analysis, evaluation, decision, and communication (Atman et al., 1999). Dominick et al. define engineering design as an iterative process, and they segment it into the following four main phases: defining the problem, formulating solutions, developing models and prototypes, and presenting and implementing the design (Dominick et al., 2001). Several other engineering design models and frameworks can be found in the literature (Pahl et al., 2007; Lewis & Samuel, 1989; French, 2010; Cross, 2001), although none of these models or frameworks is universally accepted by the engineering community (Hyman, 2003). We chose Dym's engineering design framework, shown in Figure 1, as our theoretical model to generalize and thereby simplify the large number of design activities used by our students (Dym & Little, 2004).

Figure 1. Feedback and Iteration in Dym's Design Process.

We selected this model for two reasons. First, Dym et al.'s definition of engineering design matched the goals of the senior capstone project well. Their definition is as follows: "Engineering design is a systematic, intelligent process in which designers generate, evaluate and specify concepts for devices, systems, or processes whose form and function achieve clients' objectives or users' needs while satisfying a specified set of constraints" (Dym & Brown, 2012, p. 16; Dym et al., 2005, p. 104). In addition, Dym's framework is flexible enough to be applied in different fields of engineering, including systems engineering, but still detailed enough to model important design activities, including iterations.
Dym's model was also preferable to other design process models in the literature because it aligned with both the engineering design and engineering education disciplines and was well-suited to the data collected. Since our research was focused on both design and product realization, we expanded Dym's model by adding the marketing and management categories, since many product realization projects incorporate these activities. Hence, our overall process was described by eight categories (i.e., the original six categories in Dym's design process model and two product realization categories).

For this research, we adapted 89 design activities for collecting design process data throughout an academic year (Golish et al., 2008). These activities can be organized into engineering design stages based on the literature, such as opportunity identification, design and development, testing and preproduction, and introduction and production. In studies of design, it is common to generalize design activities into a smaller set of categories and/or cognitive operations. For example, the categories of exploration, generation, comparison, and selection have been used (Stempfle & Badke-Schaub, 2002). The challenge with this approach is its dependency on the information about the design activities and the fact that these activities often occur in cycles or iterations (Ha & Porteus, 1995; Krishnan et al., 1997).

Bayesian Networks and Their Applications

Within the broad area of engineering design, our research focused specifically on creating a tool to influence innovative design outcomes. We ultimately used a Bayesian network (BN) to develop a decision support tool to increase the likelihood of innovative outcomes in design settings. A Bayesian network is a probabilistic graphical model that represents a set of variables as circular nodes and their conditional dependencies or interactions as arcs or arrows.
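As a concrete illustration of such a network, consider the smallest possible case: a single arc from a three-state usage node to a two-state innovation node. All probabilities below are hypothetical, invented for illustration; they are not values fitted from our data.

```python
# Minimal Bayesian network sketch: Usage -> Innovative (one arc).
# All probabilities are hypothetical, not the model's fitted values.
p_usage = {"low": 0.3, "medium": 0.5, "high": 0.2}        # prior on usage
p_innov_given = {"low": 0.1, "medium": 0.4, "high": 0.7}  # CPT for the arc

# Forward inference: marginalize out usage to get P(innovative).
p_innov = sum(p_usage[u] * p_innov_given[u] for u in p_usage)

# Backward inference: P(usage | innovative) via Bayes' theorem.
posterior = {u: p_usage[u] * p_innov_given[u] / p_innov for u in p_usage}

print(round(p_innov, 2))                                  # 0.37
print({u: round(p, 2) for u, p in posterior.items()})
```

Forward inference here predicts the chance of an innovative outcome from the usage prior; backward inference answers the instructor's question of which usage levels were most probable given that a design turned out innovative.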
A BN allows for forward and backward inference under uncertainty given known evidence and is useful for analyzing "what-if" scenarios, even those that are not observed in practice (Jensen & Nielsen, 2007; Yannou et al., 2013; Genie Documentation, 2014). We used the GeNie software to create our Bayesian network model (Genie Documentation, 2014). This software provides a development environment for building graphical decision models. Although Bayesian networks are formulated using only chance nodes, the "set evidence" property of GeNie allows a chance node to be treated as a "decision node" by setting the evidence to a chosen state.

Although the most popular application area for Bayesian networks is medical decision making, especially verification of a diagnosis, they have a wide range of applications in finance (e.g., market analysis), reliability (e.g., processor fault diagnosis), and defense (e.g., automatic target recognition). Bayesian networks have also been applied to engineering design problems, including improvement of the early design stage by addressing uncertainties in component characteristics and compatibility (Moullec et al., 2013). This model also contributed to innovation by ensuring product feasibility and reducing the design risk. They focused on the early design stage (i.e., conceptual design) and determined the probabilities based on expert opinion. In contrast, our BN model encompasses the stages of conceptual design through prototype development and was built using actual design-team data to estimate the probabilities. Another application of BNs to engineering design was an evaluation of innovation by considering industrial contexts (Yannou et al., 2013).
These authors performed an empirical study to identify the factors related to design and analyzed the influence of these factors on the quality of the problem setting and subsequently the problem-solving process as well as the quality of the innovative project outcome. In comparison, our model offers suggestions for the utilization levels of the design activities that may lead to more innovative design outcomes.

METHODS

Data Collection

The data used for developing our decision support tool was collected from bioengineering senior capstone design teams during the 2007-08 and 2008-09 academic years. Twenty-six teams participated, with 18 teams from an engineering school in the Mid-Atlantic region and eight teams from an engineering school in the Midwest. The design projects were similar in nature, in that all students had to design a biomedical product or device. Where possible, we minimized variability between the two institutions. The number of students per team varied from three to five, and the students were paid for their participation in the study. Students were surveyed twice per week through a secure online system to collect quantifiable data about their design activities. Within each of four design stages, a student could select up to three activities he/she had worked on. This number was arbitrarily set but was believed to be sufficient given the three to four day interval between the surveys. If the student had not worked on the project since the last survey, he/she could select "I have not worked on the design." Each student completed the survey up to a total of either 45 or 48 times, depending on his/her school. As discussed, the entire set of activities was based upon the work of Golish et al. and was further refined by the capstone instructors (Golish et al., 2008).
Students were trained in the meaning of the activities and were provided with a definition sheet for easy reference. It was assumed that students selected the activities and answered the open-ended questions in good faith. For multiple reasons, we believe the students provided data honestly. During their initial training session, students were informed that their answers would not be shared with the instructors and would not affect their grades. We utilized the training session and the assurances we gave to students at the time as a means to alleviate the Hawthorne effect (McBride, 2010). Students had the option to select "I have not worked on the design," which was chosen 129 times during the project timeframe. Further, in reviewing the data, we observed that students selected logical activities and wrote detailed reflections. Their responses did not appear to be cursory in any manner.

Although each design was graded according to the instructors' course criteria, each instructor also evaluated the projects using a common rating scale consisting of five criteria, including one for assessing the innovativeness of the final product. The five criteria used to assess the design projects were technical performance and standards, documentation, innovation, working prototype, and overall impact on the market or to the client. Each criterion also contained sub-criteria derived from the literature that the instructors collectively agreed upon. The innovation sub-criteria were based upon Schumpeter's definition of innovation; therefore, the innovativeness of the products was evaluated based on 1) new applications of existing technology, 2) use of new materials or components, 3) introduction of new manufacturing processes, and 4) design changes that reduce costs. The scoring rubric was derived from the VentureWell (formerly National Collegiate Inventors and Innovators Alliance) BMEIdea Competition (BMEidea, 2014).
Using this as a starting point, the researchers and instructors of the bioengineering capstone courses iteratively revised the rating scale to arrive at an agreed-upon set of defined attributes. Unfortunately, each instructor rated his/her teams' products and reports separately, and this is a limitation of our work. In terms of qualifications, three of the instructors were biomedical engineering faculty, and one instructor was a bioengineering faculty member as well as a co-founder of an innovative medical products company. This faculty member also has over ten medical-related patents. In addition to these qualifications for assessing the innovativeness of bioengineering projects, the literature supports the validity and consistency of faculty ratings on student and team performance when developed in an iterative manner (Stiggins, 1999; Moskal & Leydens, 2000; Callahan et al., 2000).

To determine whether our results were reflective of "innovation" versus what might constitute "good design," the correlation of innovation with each of the other criteria was calculated. With the documentation criterion, there was little variation, as all teams had similar scores; thus, documentation was not considered. The correlations of innovation with the other three criteria were between 0.42 and 0.52; thus, innovation was at best moderately correlated with the other components of design. These three correlations were significantly different from zero at α=0.05 (Ozaltin et al., 2015). The rubric scores ranged from "1" (poor) to "5" (excellent). For this research, products having a score of "4" or "5" on the innovation criterion were considered innovative; and conversely, products having scores of "1" or "2" were considered non-innovative.
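The correlation screening just described can be reproduced with a from-scratch Pearson computation. The eight rubric score pairs below are invented for illustration; they are not the study's data.

```python
# Pearson correlation between two rubric criteria, computed from scratch.
# The eight score pairs are hypothetical examples, not the study's data.
import math

innovation = [4, 2, 5, 1, 4, 2, 5, 3]
prototype  = [3, 2, 4, 2, 5, 3, 4, 3]

n = len(innovation)
mx, my = sum(innovation) / n, sum(prototype) / n
cov = sum((x - mx) * (y - my) for x, y in zip(innovation, prototype))
sx = math.sqrt(sum((x - mx) ** 2 for x in innovation))
sy = math.sqrt(sum((y - my) ** 2 for y in prototype))
r = cov / (sx * sy)

# t statistic for H0: rho = 0; the two-sided critical value at alpha = 0.05
# with n - 2 = 6 degrees of freedom is about 2.447.
t = r * math.sqrt((n - 2) / (1 - r ** 2))
print(round(r, 3), round(t, 2))
```

With these made-up scores, |t| exceeds the critical value, so the correlation would be judged significantly different from zero, mirroring the test applied to the three real criteria.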
Overall, there were eight innovative and eight non-innovative products from the 26 teams, with 10 products that were rated as neither innovative nor non-innovative.

Model Development

The 89 possible activities used by the students were classified into the eight categories of our theoretical model by an experienced research team consisting of five individuals in the field of design and product realization. The research team members first individually and then collectively arranged all activities according to Dym's model. Discrepancies between members were discussed to determine the best fit of the activities to the categories. In some cases, it was determined that certain activities could be placed in multiple categories.

The element of time was an important aspect of this research. To obtain stronger and more specific results, the project timeline was divided into three phases: early, middle, and late. Given the relatively long time frame of the students' design process (i.e., multiple terms), three separate BN models representing the early, middle and late phases were created. As seen in Figure 2, a five-day transition period was used between the phases to prevent rigid borders. A partial membership rule was applied for those activities observed in the transition period (Ozaltin et al., 2015). The early and late phases had four time epochs each, and the middle phase contained five. A time epoch was representative of approximately two to two and a half weeks of design activity.

Figure 2. Timeline of the Design Process.

We considered several modeling approaches for developing our tool. A decision-based approach was required, as engineering design teams continuously make decisions throughout the process (Lewis et al., 2006).
The model had to consider any history, since past design activities are critical in determining current and future activities. Also, although activity selection and usage influence innovativeness, they do not guarantee an innovative output. Thus, the proposed model had to allow for uncertainty. Lastly, given the data collected, there were no intermediate rewards but only a final reward (i.e., the final prototype score). Markov chains (MC), Markov decision processes (MDP), and influence diagrams (ID) were each evaluated as candidate modeling techniques. Influence diagrams supported all of the requirements, including a decision-based approach, maintenance of history, consideration of uncertainty, and accommodation of an end reward. In particular, a Bayesian network, a special case of an influence diagram, was chosen as the modeling technique.

Our Bayesian network depicts how design teams used Dym's design process across time to achieve a design artifact, whether innovative or non-innovative. Using Dym's extended model, our BN contains eight variables, as shown in Figure 3, where each variable, or design category, is represented by a node in the model. Each node has three states representing the level of usage by a design team: low, medium, or high. The variables are repeated across the time epochs t.

Figure 3. Bayesian Network Variables.

Figure 4. Bayesian Network Model.

Figure 4 shows the various nodes of the early phase model, which appear in all four time epochs. The final output of the model is a node consisting of two states: innovative vs. non-innovative. Given the large number of nodes in the model, there were a very large number of potential relationships or dependencies (i.e., arcs) among the nodes.
We determined that the majority of all possible relationships, as quantified through conditional probabilities, did not occur and were therefore not represented in our data. In addition, when all possible relationships were represented, the model became overly complicated and unwieldy. Therefore, to simplify as well as enhance the model, we instead added innovation inter-nodes with two states after each time epoch to efficiently connect the epochs, as shown in Figure 4.

Unfortunately, when evidence is entered into a subsequent epoch, the effect of the previous time epochs is virtually lost. This meant, for example, that prior to the addition of the inter-nodes, evidence set in the first and fourth time epochs did not have the same impact on the final innovation, or output, node of the phase. Further, as evidence at t+1 was entered, the innovation node at t was also updated, since evidence for a node impacts all ancestor nodes, including innovation nodes; hence, this situation potentially impacted the final output. To remedy this problem, innovation nodes containing no descendants (i.e., having no downstream nodes) were created to better preserve information about the prior time epoch, as shown in Figure 4. These new innovation nodes with no descendants were connected directly to the final output node and were collectively weighted to determine the ultimate output.

Our model assumes that any given product is either innovative or non-innovative, based on the team data that was used to build the model. Another assumption of our model, which was necessary from a model simplification standpoint and will be discussed further, is that activities utilized within time epoch t are independent of each other; however, they are dependent on the activities utilized in the prior epoch t-1.
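Under this structure, the final output node aggregates the per-epoch innovation nodes through a weighted combination. A toy sketch of that aggregation follows; the epoch probabilities and weights are hypothetical illustrations, not the values estimated in our model.

```python
# Weighted combination of the no-descendant innovation nodes into the
# final output node. All numbers are hypothetical illustrations only.
epoch_p_innov = [0.55, 0.60, 0.48, 0.70]  # P(innovative) at epochs t1..t4
weights = [0.1, 0.2, 0.3, 0.4]            # hypothetical weights (sum to 1)

p_final = sum(w * p for w, p in zip(weights, epoch_p_innov))
print(round(p_final, 3))  # 0.599
```

Because each innovation node has no descendants, entering evidence in a later epoch leaves the earlier terms of this sum untouched, which is exactly the information-preservation property the inter-node design was meant to restore.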
All three phases were modeled similarly; therefore, only the early phase is illustrated here. The middle and late-phase models each contained one additional node. This leftmost node in the middle-phase model carried information from the early phase (i.e., innovativeness of the early phase). Likewise, the leftmost node in the late-phase model carried information from the middle phase (i.e., innovativeness of the middle phase).

A clustering algorithm was applied to determine empirically-based usage levels for the design categories. Both two-step and K-means clustering algorithms were applied to the activity data, which consisted of the number of occurrences of each design activity by time epoch and team. Since the two-step algorithm yielded more balanced results, it was ultimately selected to perform the clustering. For simplicity, the chosen number of clusters for each design category was three, corresponding to low, medium, and high usage levels. We desired a small number of categories so that instructors could effectively communicate to their students any changes they should be making to enhance their innovativeness. For example, an instructor might tell a particular team, "You may want to consider doing design communication at a low level at this point and instead invest more time in detailed design by doing it at a high level." Based on the clustering as shown in Table 1, if the number of occurrences for problem definition for all team members was less than or equal to three, the team's usage level for problem definition was identified as low. If it was between four and eleven, the usage was medium; otherwise its state was high. The clustering for problem definition resulted in a cluster solution quality of 0.82, as defined by the silhouette coefficient in the SPSS software. SPSS identifies cluster solution quality as ranging from poor to good, with "good" extending from 0.50 to 1.00 (Norusis, 2011).
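The idea of binning one-dimensional activity counts into three usage levels can be illustrated with a simple k-means sketch. We actually used SPSS's two-step algorithm; the function below, including its name and seeding scheme, is ours, for illustration only.

```python
# One-dimensional k-means sketch for binning per-epoch activity counts
# into three usage levels. Stand-in for the SPSS two-step clustering we
# actually used; function name and seeding scheme are illustrative.
def kmeans_1d(values, k=3, iters=100):
    s = sorted(values)
    # Seed centers at the minimum, a middle element, and the maximum.
    centers = [float(s[(len(s) - 1) * i // (k - 1)]) for i in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda i: abs(v - centers[i]))
            clusters[nearest].append(v)
        new = [sum(c) / len(c) if c else centers[i]
               for i, c in enumerate(clusters)]
        if new == centers:  # converged
            break
        centers = new
    return centers

counts = [0, 1, 1, 2, 5, 6, 7, 12, 14, 15]  # hypothetical activity counts
print([round(c, 2) for c in kmeans_1d(counts)])  # → [1.0, 6.0, 13.67]
```

The cluster boundaries implied by the final centers play the same role as the occurrence cut points reported in Table 1.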
Clustering results for the remainder of the design categories are shown in Table 1. Since the category usage levels were determined based on actual student data,

Table 1. Usage Level Definitions for the Design Activities (usage levels given as occurrence counts)

Design Activity        Low    Medium   High         Cluster Solution Quality
Problem Definition     0–3    4–11     12 or more   0.82
Conceptual Design      0–1    2–5      6 or more    0.80
Preliminary Design     0–2    3–7      8 or more    0.67
Detailed Design        0–5    6–13     14 or more   0.77
Design Communication   0–2    3–8      9 or more    0.71
Review                 0–6    7–17     18 or more   0.70
Management             0–2    3–6      7 or more    0.72
Marketing              0      1        2 or more    0.99
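The cut points in Table 1 can be applied directly as a lookup when classifying a team's raw counts. A small sketch follows; the thresholds are taken from the table, while the function and dictionary names are ours.

```python
# Map raw activity counts per epoch to the usage levels in Table 1.
# Thresholds (low_max, medium_max) are taken from the paper's Table 1;
# the names below are illustrative, not from the paper.
THRESHOLDS = {
    "Problem Definition":   (3, 11),
    "Conceptual Design":    (1, 5),
    "Preliminary Design":   (2, 7),
    "Detailed Design":      (5, 13),
    "Design Communication": (2, 8),
    "Review":               (6, 17),
    "Management":           (2, 6),
    "Marketing":            (0, 1),
}

def usage_level(category: str, occurrences: int) -> str:
    """Return 'low', 'medium', or 'high' per the Table 1 cut points."""
    low_max, med_max = THRESHOLDS[category]
    if occurrences <= low_max:
        return "low"
    if occurrences <= med_max:
        return "medium"
    return "high"

print(usage_level("Problem Definition", 7))  # → medium
```

These discrete levels are exactly the three node states (low, medium, high) that feed the Bayesian network models.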
