Masters Degrees (Industrial Engineering)
Browsing Masters Degrees (Industrial Engineering) by advisor "Bekker, James"
Now showing 1 - 19 of 19
- ItemA data and modelling framework for strategic supply chain decision-making in the petro-chemical industry.(Stellenbosch : Stellenbosch University, 2006-12) Van Schalkwyk, Willem Tobias; Bekker, James; Stellenbosch University. Faculty of Engineering. Dept. of Industrial Engineering.ENGLISH ABSTRACT: The research was initiated by an opportunity within the petro-chemical company Sasol to explore, improve and integrate various analytical techniques used in the modelling, design and optimisation of supply chains. Although there is already a strong focus on the use of analytical applications in this environment, the lack of both modelling integration and analytical data availability has led to less than optimal results. This document presents an exploration into the supply chain planning landscape, and in particular strategic planning in the petro-chemical environment. Various modelling methodologies and techniques that support strategic supply chain decision-making are identified, followed by an in-depth analysis of the data requirements for effectively constructing each of these models. Perhaps the biggest hurdle in the continual use of modelling techniques that support strategic supply chain decision-making remains the extent of the data gathering phase in any such project. Supply chain models are usually developed on an ad hoc project basis, each time requiring extensive data gathering and analysis from transactional data systems. The reason for this is twofold: 1) transactional data are not configured to meet the analytical data requirements of supply chain models, and 2) projects are often done in isolation, resulting in supply chain data that end up in spreadsheets and point solutions. This research proposes an integrated data and modelling framework that aspires to the sustainable use of supply chain data, and continual use of modelling techniques to support strategic supply chain decision-making. The intent of the framework is twofold: 1) to enable the design of new supply chains, and 2) to ensure a structured approach for capturing historical supply chain activities for continued review and optimisation. At the heart of the framework is the supply chain analytical data repository (SCADR), a database that maintains supply chain structural and managerial information in a controlled data model. The motivation behind developing a database structure for storing supply chain data is that a standard encoding method encourages data sharing among different modelling applications and analysts. In the globalised environment of the 21st century, companies can no longer ensure their market position solely by their own functional excellence ... in the new economy, whole business ecosystems compete against each other for global survival (Moore, 1996). This motivates the ever-increasing importance of supply chain management, which necessitates the use of advanced analytical tools to assist business leaders in making ever more complex supply chain decisions. It is believed that the integration of information requirements for multiple optimisation/modelling initiatives in a structured framework (as presented in this research) will enable sustainability and improved strategic decision-making for the petro-chemical supply chain.
- ItemAdding multi-objective optimisation capability to an electricity utility energy flow simulator(Stellenbosch : Stellenbosch University, 2016-03) Brits, Ryno Ockert; Bekker, James; Stellenbosch University. Faculty of Engineering. Dept. of Industrial Engineering.ENGLISH ABSTRACT: The energy flow simulator (EFS) is a strategic decision support tool that was developed for the South African national electricity utility Eskom. The advanced set of algorithms incorporated into the EFS enables various departments within Eskom to simulate and analyse the Eskom value chain from primary energy to end-use over a certain study horizon. The research in this thesis is aimed at determining whether multi-objective optimisation (MOO) capability can be added to the EFS. The study forms part of a series of research projects. This project builds on the work of Hatton (2015) in which the focus was on single-objective optimisation capability for the EFS. Inventory management at Eskom's coal-fired power stations was identified as the most suitable area for the formulation of an MOO model. It was also identified that certain modifications to the existing EFS architecture can possibly improve its potential as an optimisation tool. The architecture of the EFS is studied and modifications to it are proposed. A multi-objective inventory model is then formulated for Eskom's network of coal-fired power stations using the simulation outputs of the EFS. The model is based on the movement of coal between the various power stations in an attempt to maintain an optimal inventory level at each station as far as possible. To solve the model, a suitable MOO algorithm is selected and integrated with the simulation component of the EFS. Several experiments are conducted to validate the MOO model and test the effectiveness of the algorithm in solving the optimisation problem.
- ItemA decision support framework for machine learning applications(Stellenbosch : Stellenbosch University, 2020-20) Du Preez, Anli; Bekker, James; Stellenbosch University. Faculty of Engineering. Dept. of Industrial Engineering.ENGLISH ABSTRACT: Data is currently one of the most critical and influential emerging technologies. Organisations and employers around the globe strive to investigate and exploit the exponential data growth to discover hidden insights in an effort to create value. Value creation from data is made possible through data analytics (DA) and machine learning (ML). The true potential of data is yet to be exploited since, currently, about 1% of generated data is ever actually analysed for value creation. There is a data gap. Data is available and easy to capture; however, the information therein remains untapped yet ready for digital explorers to discover the hidden value in the data. One main factor contributing to this gap is the lack of expert knowledge in the field of DA and ML. In a survey of 437 companies, 76% indicated an interest to invest in DA and ML technologies over the years of 2015 to 2017. However, in a survey of 400 companies, 4% indicated that they have the right strategic intent, skilled people, resources and data to gain meaningful insights from their data and to act on them. Small, medium and micro enterprises (SMMEs) lack the availability of DA and ML skills in their existing workforce, have limited infrastructure to realise ML and have limited funding to employ ML tools and expertise. They need proper guidance as to how to employ ML in a low-cost, feasible and sustainable way. This study focused on addressing this data gap by providing a decision support framework for ML algorithms. The goal of this study was therefore to develop and validate a decision support framework which considers both the data characteristics and the application type to enable SMMEs to choose the appropriate ML algorithm for their unique data and application purpose. This study aimed to develop the framework for a semi-skilled analyst, with mathematics, statistics and programming education, who is familiar with the process of programming, yet has not specialised in the variety of ML algorithms which are available. This research project followed the Soft Systems Methodology and utilised Jabareen's framework development methodology. Various literature studies were performed on data, DA, application purposes, ML and the process of applying ML. The CRoss-Industry Standard Process for Data Mining (CRISP-DM) was followed to design and implement the experiments. The results were evaluated and summarised to create the decision support framework. The framework was validated by consulting subject matter experts (SMEs) and possible end-users (PEUs).
- ItemDeveloping a dynamic mission-ready resource problem solution model(Stellenbosch : Stellenbosch University, 2020-03) Leuvennink, Bernard Cornelius; Bekker, James; Stellenbosch University. Faculty of Engineering. Dept. of Industrial Engineering.ENGLISH ABSTRACT: The mission-ready resource allocation (MRRA) problem is well-known in Operations Research. It is a combinatorial problem which quickly assumes a large decision space as the problem is extended to practical situations. Efforts to solve the MRRA problem were aimed at given scenarios, and once a solution was found, the solution process was terminated. In this study, the question of a dynamic, stochastic MRRA (DSMRRA) problem is investigated. In this problem, a series of missions that are distributed over time require resources, and the availability of resources varies. The objective is to find good solutions for all the missions; the allocation of a given mission will influence the available resources of a subsequent mission. In this study, simulation and bi-objective optimisation were used to demonstrate that the DSMRRA problem can be solved. The static MRRA problem originated in the military, but in this study, it is demonstrated that the DSMRRA problem can also be applied to business operations.
- ItemDeveloping basic soccer skills using reinforcement learning for the RoboCup small size league(Stellenbosch : Stellenbosch University, 2015-03) Yoon, Moonyoung; Bekker, James; Kroon, R. S. (Steve); Stellenbosch University. Faculty of Engineering. Dept. of Industrial Engineering.ENGLISH ABSTRACT: This study started as part of a research project at Stellenbosch University (SU) that aims at building a team of soccer-playing robots for the RoboCup Small Size League (SSL). In the RoboCup SSL the Decision-Making Module (DMM) plays an important role for it makes all decisions for the robots in the team. This research focuses on the development of some parts of the DMM for the team at SU. A literature study showed that the DMM is typically developed in a hierarchical structure where basic soccer skills form the fundamental building blocks and high-level team behaviours are implemented using these basic soccer skills. The literature study also revealed that strategies in the DMM are usually developed using a hand-coded approach in the RoboCup SSL domain, i.e., a specific and fixed strategy is coded, while in other leagues a Machine Learning (ML) approach, Reinforcement Learning (RL) in particular, is widely used. This led to the following research objective of this thesis, namely to develop basic soccer skills using RL for the RoboCup Small Size League. A second objective of this research is to develop a simulation environment to facilitate the development of the DMM. A high-level simulator was developed and validated as a result. The temporal-difference value iteration algorithm with state-value functions was used for RL, along with a Multi-Layer Perceptron (MLP) as a function approximator. Two types of important soccer skills, namely shooting skills and passing skills were developed using the RL and MLP combination. Nine experiments were conducted to develop and evaluate these skills in various playing situations. The results showed that the learning was very effective, as the learning agent executed the shooting and passing tasks satisfactorily, and further refinement is thus possible. In conclusion, RL combined with MLP was successfully applied in this research to develop two important basic soccer skills for robots in the RoboCup SSL. These form a solid foundation for the development of a complete DMM along with the simulation environment established in this research.
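The skills above were learnt with temporal-difference value iteration over state-value functions, approximated by a Multi-Layer Perceptron. A minimal sketch of that combination follows, in Python with scikit-learn; the state encoding, reward and network size are illustrative stand-ins, not the thesis's actual shooting or passing task setup.

```python
# Sketch: TD(0) learning of a state-value function with an MLP
# approximator. Environment, states and rewards are toy stand-ins.
import numpy as np
from sklearn.neural_network import MLPRegressor

GAMMA = 0.9  # discount factor (assumed value)

value_fn = MLPRegressor(hidden_layer_sizes=(32,))
value_fn.partial_fit(np.zeros((1, 4)), np.zeros(1))  # prime for incremental fits

def td0_update(state, reward, next_state, done):
    """One TD(0) step: move V(state) toward reward + GAMMA * V(next_state)."""
    target = reward
    if not done:
        target += GAMMA * value_fn.predict(next_state.reshape(1, -1))[0]
    value_fn.partial_fit(state.reshape(1, -1), np.array([target]))

# Toy episode: random 4-dimensional states standing in for ball/robot features.
rng = np.random.default_rng(0)
state = rng.normal(size=4)
for step in range(100):
    next_state = rng.normal(size=4)
    reward = 1.0 if step == 99 else 0.0  # sparse reward, e.g. a successful shot
    td0_update(state, reward, next_state, done=(step == 99))
    state = next_state
```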
- ItemDevelopment and demonstration of a Customer Super-Profiling tool utilising data analytics for alternative targeting in marketing campaigns(Stellenbosch : Stellenbosch University, 2018-12) Walters, Marisa; Bekker, James; Stellenbosch University. Faculty of Engineering. Dept. of Industrial Engineering.ENGLISH ABSTRACT: Being part of a competitive generation demands that a business has good marketing policies to attract new customers as well as to retain existing ones. Marketing managers can develop long-term and healthy relationships with customers if they can detect and predict changes in their customers' purchasing behaviour. With the growth of information systems and technology, businesses have an increasing capability to accumulate huge quantities of customer data in large databases. However, many of these potentially useful marketing insights into customer characteristics and their purchasing patterns often remain hidden and untapped. Therefore, businesses can achieve competitive advantages by studying customer behaviour through data mining tools (i.e. supervised and unsupervised learning) and techniques (i.e. classification, regression and clustering). The goal of this research project was to develop a Customer Super-Profiling (CSP) tool that has the ability to analyse large (non-aggregate) customer datasets, considering both demographic and behavioural features. The data analytics was done by utilising more than one data mining tool, which generates customer super-profiles. These profiles are used to attract and classify new customers as well as to retain existing customers, providing the user with the ability to predict each customer's specific needs. This research project outlines a general methodology for segmentation of customers by using the Recency, Frequency and Monetary (RFM) model, together with k-means clustering (unsupervised learning) to identify the various types of customers within the dataset. Customer profiles are then generated, in the form of decision rules (supervised learning), to identify each type of customer as well as classifying them into the various clusters created. These predictions are performed based on the customers' demographic and behavioural features. The CSP tool was applied and demonstrated on large customer datasets from four different domains and useful results were found.
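As a rough illustration of the segmentation step described above, the following sketch computes Recency, Frequency and Monetary features from a toy transaction table and clusters customers with k-means; the column names, data and number of clusters are hypothetical, not from the thesis.

```python
# Sketch: RFM features per customer, then k-means segmentation.
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

tx = pd.DataFrame({
    "customer_id": [1, 1, 2, 2, 2, 3],
    "date": pd.to_datetime(["2018-01-05", "2018-03-01", "2018-02-10",
                            "2018-02-20", "2018-03-05", "2018-01-15"]),
    "amount": [120.0, 80.0, 35.0, 20.0, 50.0, 400.0],
})
snapshot = tx["date"].max() + pd.Timedelta(days=1)

rfm = tx.groupby("customer_id").agg(
    recency=("date", lambda d: (snapshot - d.max()).days),  # days since last purchase
    frequency=("date", "count"),                            # number of purchases
    monetary=("amount", "sum"),                             # total spend
)

X = StandardScaler().fit_transform(rfm)  # put R, F and M on comparable scales
rfm["cluster"] = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(rfm)
```

Decision rules (the supervised step) would then be induced per cluster, e.g. with a decision-tree learner over the demographic and behavioural features.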
- ItemDevelopment of a Big Data analytics demonstrator(Stellenbosch : Stellenbosch University, 2018-12) Butler, Rhett Desmond; Bekker, James; Stellenbosch University. Faculty of Engineering. Dept. of Industrial Engineering.ENGLISH ABSTRACT: The continued development of the information era has established the term `Big Data' and large datasets are now easily created and stored. Now humanity begins to understand the value of data, and more importantly, that valuable insights are captured within data. To uncover and convert these insights into value, various mathematical and statistical techniques are combined with powerful computing capabilities to perform analytics. This process is described by the term `data science'. Machine learning is part of data analytics and is based on some of the mathematical techniques available. The ability of the industrial engineer to integrate systems and incorporate new technological developments benefiting business makes it inevitable that the industrial engineering domain will also be involved in data analytics. The aim of this study was to develop a demonstrator so that the industrial engineering domain can learn from it and have first-hand knowledge in order to better understand a Big Data Analytics system. This study describes how the demonstrator as a system was developed, what practical obstacles were encountered as well as the techniques currently available to analyse large datasets for new insights. An architecture has been developed based on existing but somewhat limited literature, and a hardware implementation has been done accordingly. For the purpose of this study, three computers were used: the first was configured as the master node and the other two as slave nodes. Software that coordinates and executes the analysis was identified and used to analyse various test datasets available in the public domain. The datasets are in different formats which require different machine learning techniques. These include, among others, regression under supervised learning, and k-means under unsupervised learning. The performance of this system is compared with a conventional analytics configuration, in which only one computer is used. The criteria used were 1) the time to analyse a dataset using a given technique and 2) the accuracy of the predictions made by the demonstrator and conventional system. The results were determined for several datasets, and it was found that smaller datasets were analysed faster by the conventional system, but it could not handle larger datasets. The demonstrator performed very well with larger datasets and all the machine learning techniques applied to it.
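The abstract does not name the coordinating software; Apache Spark is a common choice for a master/slave analytics cluster of this kind, so the sketch below shows a distributed k-means in that spirit using PySpark. The host names, file path and parameters are placeholders, not details from the thesis.

```python
# Sketch: distributed k-means on a small Spark cluster (assumed stack).
from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.clustering import KMeans

# Connect to the cluster master; host name and port are placeholders.
spark = (SparkSession.builder
         .master("spark://master-node:7077")
         .appName("kmeans-demo")
         .getOrCreate())

# Load a large CSV of numeric columns from distributed storage (placeholder path).
df = spark.read.csv("hdfs://master-node:9000/data/points.csv",
                    header=True, inferSchema=True)

# Spark ML expects a single vector column of features.
features = VectorAssembler(inputCols=df.columns, outputCol="features").transform(df)

# The k-means work is distributed across the worker (slave) nodes.
model = KMeans(k=5, seed=1, featuresCol="features").fit(features)
print(model.clusterCenters())

spark.stop()
```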
- ItemDevelopment of a peer-to-peer voucher donation management system enabled by blockchain(Stellenbosch : Stellenbosch University, 2023-02) Harraway, Tricia Margaret; Bekker, James; Stellenbosch University. Faculty of Engineering. Dept. of Industrial Engineering.ENGLISH ABSTRACT: In recent years, the charity and philanthropy sectors have experienced a surge in the availability of digital resources to aid in providing transparency in their operations and donation processing. There is no doubt that the world is transitioning into a digital space, where new solutions to existing problems could be approached with emerging technology. The Payments Association of South Africa (PASA) is aiming for a cashless society by 2030. Two frequently cited barriers to financial inclusion and bank account ownership are distrust in the system and a lack of formal identification. As a result, FinTech companies are playing a larger role in providing innovative solutions for the undocumented and unbanked population, using technologies such as blockchain. Africa has a rich history of community philanthropy where horizontal giving is likely to be in the form of micro-donations such as cash, food or clothing. If fewer individuals rely on paper notes and coins, the concern is that those informal peer-to-peer donations will become less frequent and more challenging to make. Vouchers have been used in the past to ensure donations are spent responsibly. In addition, e-vouchers provide higher levels of security, the possibility for different donation platforms to be digitally integrated and adapted, as well as improved levels of engagement in the donation process. Programmable vouchers have all these benefits and also the potential to enable direct and customised donor-to-beneficiary charitable payments, without the need for trusted third party service providers. The potential for a digital, transparent peer-to-peer mechanism to donate vouchers in the form of blockchain tokens is presented. Blockchain technology enables distributed networks of trust, where instead of relying on a central entity, trust is shared among peers. Furthermore, the level of transparency presented by blockchain enables traceability of the movements of the donations. By disintermediating the process of supporting peers, providing transparency in resource allocation and increasing transaction efficiency, blockchain provides a use-case for decentralised, horizontal and direct ‘giving’. This research project explores the development of a digital voucher donation mechanism where the shortcomings of traditional voucher management systems, such as requiring a bank account or mobile device or needing formal identification on request, are challenged. To address these challenges and the need for financial inclusion, existing voucher transfer architectures and humanitarian information systems were consulted and a demonstrator was developed. A decentralised application with a front-end is designed and developed to interface with the smart contracts that hold the logic of the voucher donation management system. Vouchers are minted as tokens on the Polygon network to present scalability in the future. Traditionally, voucher transaction information was stored in a centralised database; here it is transparently recorded on a distributed network. The blockchain acts as the back-end supporting the smart contract implementation, which is deployed with predefined conditions mirroring traditional escrow services.
Digital wallets are integrated, allowing beneficiaries to store and redeem their voucher tokens at participating vendors via a QR code public address. The smart contracts are verified and validated using various software tests to ensure that their functionality fulfils the requirements of the research project. The evaluation of the demonstrator showed that blockchain technology, while still in its infancy, has the potential to empower the underserved, while providing full transparency for the stakeholders involved.
- ItemE-commerce web site evaluation : developing a framework and method for the systematic evaluation of e-commerce web sites and using correspondence analysis to represent the results graphically per industry(Stellenbosch : Stellenbosch University, 2001-12) Van der Merwe, Rian; Bekker, James; Pitt, Leyland; Stellenbosch University. Faculty of Engineering. Dept. of Industrial Engineering.ENGLISH ABSTRACT: The corporate web site is essential to companies who use the Internet for e-commerce purposes. For these companies, the web site is the platform used to communicate with customers and facilitate business transactions. Internet companies will not be able to do business successfully with an ineffective web site, because this implies that the only contact point that the company has with customers is not functioning properly. It is, however, extremely difficult to identify what constitutes an effective e-commerce web site. A great need therefore exists for a comprehensive and accurate method to evaluate the performance of the web sites of Internet companies, not only individually but also in comparison with the web sites of other companies in the same industry. Managers of Internet companies would certainly like to know how their web sites perform, what they can do to increase their performance, and which web sites in their industry can be used as a benchmark in certain areas. This thesis aims to address these needs by fulfilling three objectives: ► To develop a framework and criteria for the comprehensive evaluation of e-commerce web sites. ► To use this framework and sound statistical reasoning to develop a method that can be used to evaluate e-commerce web sites quantitatively, and represent the results graphically per industry. ► To implement this method by developing computer software that enables users to evaluate web sites and plot the results. To accomplish these objectives, the following methodology was followed: ► Review the research done in the field of web site evaluation for both general and e-commerce web sites. ► Review the research on different techniques in the field of Multidimensional Scaling, and identify an appropriate technique for developing two-dimensional plots of web site evaluation data. ► Expand the web site evaluation research and develop a framework and objective criteria for the evaluation of e-commerce web sites, based on solid business principles. ► Develop a method to gather web site evaluation data that is grouped within industries, and to represent the results graphically using an appropriate Multidimensional Scaling technique. ► Implement the method by developing computer software to automate the process. This document describes the course of the methodology in detail. It reports on the e-commerce web site evaluation framework that was developed; Correspondence Analysis as the Multidimensional Scaling technique used to analyse the evaluation data; the development of the e-commerce web site evaluation method; and the software that was developed in Microsoft Visual Basic to implement the evaluation method. All three objectives were fulfilled in this thesis, in spite of some concerns that are also discussed. The evaluation framework and accompanying software can be used to evaluate all aspects of e-commerce web sites, and the output can be used to draw meaningful conclusions about how these sites can be improved.
- ItemIdentity theft risk quantification for social media users(Stellenbosch : Stellenbosch University, 2017-03) Michau, Nicola; Bekker, James; Stellenbosch University. Faculty of Engineering. Dept. of Industrial Engineering.ENGLISH ABSTRACT: The information era has made it difficult to protect and secure one's personal information. One such struggle is that of identity theft, a crime that has caused great suffering to its victims. Offenders guilty of the crime use the identities of their victims for the purpose of entertainment or fraud. Social media has extended the capability of people to interact and share information, but without the appropriate guidelines to protect individuals from becoming victims of identity theft. There is a lack of studies on identity theft and its determinants. The purpose of the research is therefore to assist with the prevention of identity theft by determining the effect that information-sharing on social media has on the risk of individuals becoming identity theft victims. The details of reported identity theft victims were collected from the South African Fraud Prevention Services. Data on individuals' information-sharing habits on social media networks, like Facebook and LinkedIn, was collected via surveys that were sent to a relevant group at Stellenbosch University. It was found that the two variables, Age and Gender, were the greatest predictors of identity theft victims. A prediction model was developed that serves as a tool to score individuals as high-risk or low-risk victims according to their attributes and social media information-sharing habits. The findings benefit research on the prevention of identity theft by raising awareness of the potential risks of sharing sensitive data on social media.
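A scoring model of the kind described, classifying individuals as high-risk or low-risk from demographic and information-sharing attributes, might look like the following sketch; the features, data and cut-off are illustrative assumptions, not the thesis's fitted model.

```python
# Sketch: a victim-risk classifier over demographic and sharing features.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Columns: age, gender (0/1), shares_birthdate, shares_employer, shares_phone
X = np.array([
    [23, 0, 1, 0, 1],
    [45, 1, 1, 1, 1],
    [31, 0, 0, 0, 0],
    [52, 1, 1, 1, 0],
    [19, 0, 0, 1, 1],
    [60, 1, 0, 0, 0],
])
y = np.array([0, 1, 0, 1, 0, 0])  # 1 = reported identity-theft victim (toy labels)

model = LogisticRegression().fit(X, y)
risk = model.predict_proba(X)[:, 1]
label = np.where(risk >= 0.5, "high-risk", "low-risk")  # assumed cut-off
print(list(zip(risk.round(2), label)))
```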
- ItemImprovement of airline check-in and boarding processes(Stellenbosch : Stellenbosch University, 2022-04) Corlett, Bradley Michael; Bekker, James; Stellenbosch University. Faculty of Engineering. Dept. of Industrial Engineering.ENGLISH ABSTRACT: Since its inception in 1903, the aviation industry has become one of the biggest and most important sectors in the service industry worldwide. Airlines are constantly trying to find ways to improve customers’ experience while remaining profitable. Best practices exist within airline communities; however, many airlines conduct their key processes in different ways from that of their competitors. The airline check–in and boarding processes were identified to have significant improvement opportunities and were, therefore, the two processes focused on in this study. Aircraft turnaround time is a major focus and key performance indicator for airlines in South Africa. Although many elements affect this turnaround time, the process of boarding passengers is often the biggest bottleneck. Varying the boarding method can improve the flow of passengers through the cabin, which often leads to a shorter boarding time. Many international airlines use a zonal boarding system, which entails boarding passengers in separate groups. This study explains alternative ways to board aircraft and presents real–world trials that took place at Cape Town International Airport. A new boarding method was identified to increase the average number of passengers boarded per minute and shows great potential to improve aircraft turnaround time. One of the many observations made during the trials that affect the efficiency of the boarding process was that passengers tend to carry excessive hand luggage. Simulation and business process modelling tools were used to analyse both the as–is and modified airline check–in process. A simulation model was built to experiment with variations in the allocation of check–in counters for an identified industry partner. This model was verified and validated before running preliminary and final simulation experiments. Key performance measures, including the amount of time a passenger spends in the check–in system, were estimated and used to determine the industry partner’s best–suited counter configuration. These simulation study results also provide evidence for the differences in target performance measures, should the industry partner want to improve the system further. Self–service check–in kiosks were identified to be underutilised in South African airports and would only benefit passengers should additional functionality be added. The varying effects of passengers using online check–in were also explored in this study, as the differences between using the regular check–in and bag drop counters proved to be minimal. With the help of a systematic approach and relevant software–based tools, many more airline processes can be analysed to identify potential improvements. This study supports the notion of industrial engineering being a field that has spread into many industries around the world.
- ItemIntegration of ranking and selection methods with the multi-objective optimisation cross-entropy method(Stellenbosch : Stellenbosch University, 2015-03) Von Lorne von Saint Ange, Chantel; Bekker, James; Stellenbosch University. Faculty of Engineering. Dept. of Industrial Engineering.ENGLISH ABSTRACT: A method for multi-objective optimisation using the cross-entropy method (MOO CEM) was recently developed by Bekker & Aldrich (2010) and Bekker (2012). The method aims to identify the non-dominated solutions of multi-objective problems, which are often dynamic and stochastic. The method does not use a statistical ranking and selection technique to account for the stochastic nature of the problems it solves. The research in this thesis aims to investigate possible techniques that can be incorporated into the MOO CEM. The cross-entropy method for single-objective optimisation is studied first. It is applied to an interesting problem in the soil sciences and water management domain. The purpose of this was for the researcher to grasp the fundamentals of the cross-entropy method, which will be needed later in the study. The second part of the study documents an overview of multi-objective ranking and selection methods found in literature. The first method covered is the multi-objective optimal computing budget allocation algorithm. The second method extends the first to include the concept of an indifference-zone. Both methods aim to maximise the probability of correctly selecting the non-dominated scenarios, while intelligently allocating simulation replications to minimise required sample sizes. These techniques are applied to two problems that are represented by simulation models, namely the buffer allocation problem and a classic single-commodity inventory problem. Performance is measured using the hyperarea indicator and Mann-Whitney U-tests. It was found that the two techniques have significantly different performances, although this could be due to the different number of solutions in the Pareto set. In the third part of the document, the aforementioned multi-objective ranking and selection techniques are incorporated into the MOO CEM. Once again, the buffer allocation problem and the inventory problem were chosen as test problems. The results were compared to experiments where the MOO CEM without ranking and selection was used. Results show that the MOO CEM with ranking and selection has varying effects on different problems. Investigating the possibility of incorporating ranking and selection differently in the MOO CEM is recommended as future research. Additionally, the combined algorithm should be tested on more stochastic problems.
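The two performance measures named above can be made concrete. The sketch below computes the hyperarea (hypervolume) indicator for a two-objective minimisation front and then applies a Mann-Whitney U-test to compare hyperarea samples from two algorithms; the fronts, reference point and run values are made up for illustration.

```python
# Sketch: hyperarea of a 2-D minimisation front, plus a Mann-Whitney U-test.
import numpy as np
from scipy.stats import mannwhitneyu

def hyperarea_2d(front, ref):
    """Area dominated by a non-dominated 2-D front (minimisation),
    bounded by the reference point ref."""
    pts = sorted(front)  # ascending in f1; a true front is then descending in f2
    area = 0.0
    for i, (f1, f2) in enumerate(pts):
        next_f1 = pts[i + 1][0] if i + 1 < len(pts) else ref[0]
        area += (next_f1 - f1) * (ref[1] - f2)  # rectangle owned by point i
    return area

print(hyperarea_2d([(1.0, 3.0), (2.0, 2.0), (3.0, 1.0)], ref=(4.0, 4.0)))  # 6.0

# Hyperarea values from repeated runs of two hypothetical algorithms.
runs_a = [6.0, 5.8, 6.1, 5.9, 6.2]
runs_b = [5.1, 5.3, 5.0, 5.4, 5.2]
stat, p = mannwhitneyu(runs_a, runs_b, alternative="two-sided")
print(f"U={stat}, p={p:.4f}")
```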
- ItemA location science model for the placement of POC CD4 testing devices as part of South Africa's public healthcare diagnostic service delivery model(Stellenbosch : Stellenbosch University, 2015-03) Oosthuizen, Louzanne; Bekker, James; Stellenbosch University. Faculty of Engineering. Dept. of Industrial Engineering.ENGLISH ABSTRACT: South Africa has a severe HIV (human immunodeficiency virus) burden and the management of the disease is a priority, especially in the public healthcare sector. One element of managing the disease, is determining when to initiate an HIV positive individual onto anti-retroviral therapy (ART), a treatment that the patient will remain on for the remainder of their lifetime. For the majority of HIV positive individuals in the country, this decision is governed by the results of a CD4 (cluster of differentiation 4) test that is performed at set time intervals from the time that the patient is diagnosed with HIV until the patient is initiated onto ART. A device for CD4 measurement at the point of care (POC), the Alere PIMA™, has recently become commercially available. This has prompted a need to evaluate whether CD4 testing at the POC (i.e. at the patient serving healthcare facility) should be incorporated into the South African public healthcare sector's HIV diagnostic service provision model. One challenge associated with the management of HIV in the country is the relatively large percentage of patients that are lost to follow-up at various points in the HIV treatment process. There is extensive evidence that testing CD4 levels at the POC (rather than in a laboratory, as is the current practice) reduces the percentage of patients that are lost to follow-up before being initiated onto ART. Therefore, though POC CD4 testing is more expensive than laboratory-based CD4 testing, the use of this technology in South Africa should be investigated for its potential to positively influence health outcomes. In this research, a multi-objective location science model is used to generate scenarios for the provision of CD4 testing capability. For each scenario, CD4 testing provision at 3 279 ART initiation facilities is considered. For each facility, either (i) a POC device is placed at the site; or (ii) the site's testing workload is referred to one of the 61 CD4 laboratories in the country. To develop this model, the characteristics of eight basic facility location models are compared to the attributes of the real-world problem in order to select the most suitable one for application. The selected model's objective, assumptions and inputs are adjusted in order to adequately model the real-world problem. The model is solved using the cross-entropy method for multi-objective optimisation and the results are verified using a commercial algorithm. Nine scenarios are selected from the acquired Pareto set for detailed presentation. In addition, details on the status quo as well as a scenario where POC testing is used as widely as possible are also presented. These scenarios are selected to provide decision-makers with information on the range of options that should be considered, from no or very limited use to widespread use of POC testing. Arguably the most valuable contribution of this research is to provide an indication of the optimal trade-off points between an improved healthcare outcome due to POC CD4 testing and increased healthcare spending on POC CD4 testing in the South African public healthcare context.
This research also contributes to the location science literature and the metaheuristic literature.
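The underlying decision structure, each facility either receiving a POC device or referring tests to a laboratory, can be sketched as a bi-objective evaluation of a placement vector, as below; all costs, workloads and loss-to-follow-up rates are invented illustrative numbers, not thesis data.

```python
# Sketch: evaluate the two objectives (cost, patients lost to follow-up)
# of a POC-device placement vector over a toy set of facilities.
import numpy as np

rng = np.random.default_rng(1)
n_fac = 10                                 # 3 279 facilities in the real problem
workload = rng.integers(100, 2000, n_fac)  # annual CD4 tests per facility (toy)
POC_DEVICE_COST, POC_TEST_COST, LAB_TEST_COST = 20_000.0, 9.0, 5.0
LTFU_LAB, LTFU_POC = 0.35, 0.15            # assumed loss-to-follow-up fractions

def evaluate(x):
    """x[i] = 1 places a POC device at facility i, 0 refers tests to a lab.
    Returns (total cost, expected patients lost to follow-up)."""
    cost = np.sum(x * (POC_DEVICE_COST + workload * POC_TEST_COST)
                  + (1 - x) * workload * LAB_TEST_COST)
    lost = np.sum(workload * np.where(x == 1, LTFU_POC, LTFU_LAB))
    return cost, lost

# Three candidate placements: all-lab, all-POC, and a random mix.
for x in (np.zeros(n_fac, int), np.ones(n_fac, int), rng.integers(0, 2, n_fac)):
    print(x, evaluate(x))
```

A multi-objective optimiser such as the MOO CEM then searches over such vectors for the Pareto set of cost versus health-outcome trade-offs.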
- ItemOptimising the passage through charted minefields by path planning and mine removal(Stellenbosch : Stellenbosch University, 2006-04) Schmid, Jorg Peter; Bekker, James; Stellenbosch University. Faculty of Engineering. Dept. of Industrial Engineering.ENGLISH ABSTRACT: Shipping is the lifeline to maritime nations. Therefore, it is essential that approaches to harbours and other strategic areas are kept free of threats by sea mines. With the technological possibility of remotely surveying threatening sea minefields, it has become necessary to develop a method by which such a charted minefield can be transited with least risk to shipping. To achieve this, two areas of interest have to be addressed and the resulting questions solved. This thesis addresses that requirement by meeting the following objectives: - to propose a methodology by which the risk involved in transiting a minefield can be managed so that paths of acceptable risks can be taken through a minefield; - if acceptable paths do not exist, to develop a methodology by which the minimum number of mines can be identified for removal so that a sufficiently safe path is established. These objectives were met by following the approach outlined below: - defining the problem in the context of traditional and developing mine warfare and mine countermeasures; - clearly stating the problems that have to be solved; - investigating the enablers available to solve the problems; - selecting and motivating a suitable approach; - describing the background knowledge to the proposed solutions; - implementing the solutions in a useable computer application; - investigating the parameters that pertain to the solution and presenting the findings; - drawing conclusions from the results and insights obtained from exposure to the problem and the solution strategies. The presented methodology uniquely combines two methods of combinatorial optimisation to give an integrated solution to the two stated problems of quantifying a risk methodology and removing required mines. The methods use the well-known shortest-path algorithm, Dijkstra's Algorithm, and a Genetic Algorithm as the basis of the proposed solution. Also, elements of the principle of the Efficient Frontier Graph are integrated to illustrate the aspects of return versus risk. The solution to finding a safe path through a charted minefield is approached from two risk principles: - Finding a path that is optimised for minimum risk over the entire length of the path. Here risk is a function of the distance between the mine and the ship. - Defining a maximum allowable risk and minimising the path length. Here risk is translated into the closest distance that a ship is allowed to approach a mine, with areas closer than that being declared out of bounds. The sea mine removal problem is solved primarily by using a Genetic Algorithm that bases the quality of a solution on a parameter obtained by applying the methodology developed for solving the problem of optimising a path. This is achieved by minimising the number of sea mines to be removed to create a safe path.
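The first risk principle above, minimising accumulated risk as a function of ship-mine distance, can be sketched with Dijkstra's algorithm on a grid; the grid size, mine positions and risk function below are illustrative choices, and the thesis's Genetic Algorithm and Efficient Frontier elements are not reproduced here.

```python
# Sketch: minimum-risk path across a gridded minefield via Dijkstra.
import math
import networkx as nx

MINES = [(3, 4), (6, 2), (5, 7)]  # charted mine positions (toy)
RISK_WEIGHT = 5.0                 # trade-off between path length and proximity

def risk(cell):
    """Risk at a cell: decays with distance to each charted mine."""
    return sum(math.exp(-math.dist(cell, m)) for m in MINES)

G = nx.grid_2d_graph(10, 10)      # 4-connected transit grid

def edge_cost(u, v, data):
    return 1.0 + RISK_WEIGHT * risk(v)  # unit step length plus risk penalty

path = nx.dijkstra_path(G, (0, 0), (9, 9), weight=edge_cost)
print(path)
```

The second principle (a hard stand-off distance) would instead delete all nodes within the allowed approach radius of any mine before running the same shortest-path search.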
- ItemPredicting the next purchase date for an individual customer using machine learning(Stellenbosch : Stellenbosch University, 2020-12) Droomer, Marli; Bekker, James; Stellenbosch University. Faculty of Engineering. Dept. of Industrial Engineering.ENGLISH ABSTRACT: We live in a world that is rapidly changing when it comes to technology. Gathering a customer’s information becomes easier as companies have loyalty programs that track the customer’s purchasing behaviour. We live in an era where search engines suggest your next word, online shopping is no longer scary, and people order a ride by means of an application. The fact is that technology is evolving, and gathering information from customers is becoming easier. Given this change, the questions, however, are: How do companies use this information to gain a competitive advantage? Do they use this information to benefit the customer? How can a company use customer information to give each individual a unique experience? A research study was conducted to determine if an individual customer’s next purchase date for specific products can be predicted by means of machine learning. The focus was on fast-moving consumer goods in retail. This next purchase date can then be used to individualise marketing to customers, which benefits the company and the customer. In this study, the customer’s purchase history is used to train machine learning models. These models are then used to predict the next purchase date for a customer-product pair. The different machine learning models that are used are recurrent neural networks, linear regression, extreme gradient boosting and an artificial neural network. Combination approaches are also investigated, and the models are compared by the absolute error, in days, that the model predicts from the target variable. The artificial neural network model performed the best, predicting 31.8% of the dataset with an absolute error of less than one day, and 55% of the dataset with an absolute error of less than three days. The application of the artificial neural network as the Next Purchase Date Predictor is also demonstrated and shows how individualised marketing can be done using the Next Purchase Date Predictor. The encouraging results of the Next Purchase Date Predictor showed that machine learning could be used to predict the next purchase date for an individual customer.
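A minimal sketch of the underlying prediction task, learning the gap in days to a customer's next purchase from previous inter-purchase gaps and reporting the absolute error in days, is given below; the lag-feature design, simulated data and network size are illustrative, not the thesis's tuned models.

```python
# Sketch: predict days until the next purchase from the three previous
# inter-purchase gaps; the simulated data carry little real signal, so
# the point here is the mechanics, not the accuracy.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
gaps = rng.normal(loc=14, scale=3, size=(500, 4)).clip(1)  # gaps in days
X, y = gaps[:, :3], gaps[:, 3]  # three previous gaps -> next gap

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X[:400], y[:400])

pred = model.predict(X[400:])
errors = np.abs(pred - y[400:])
print("MAE (days):", round(mean_absolute_error(y[400:], pred), 2))
print("share with error < 1 day:", (errors < 1.0).mean())
```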
- ItemProcess flexibility in a vaccine manufacturing system under high demand uncertainty(Stellenbosch : Stellenbosch University, 2023-03) Spamer, Marike; Bam, Louzanne; Bekker, James; Stellenbosch University. Faculty of Engineering. Dept. of Industrial Engineering.ENGLISH ABSTRACT: The Covid-19 disease was first diagnosed in December 2019 in Wuhan, China. The transmission of the disease occurred at a rapid pace, causing disruption to the world’s health systems and the global economy. This created an incentive to quickly develop an effective vaccine against Covid-19. Manufacturing the vaccine products to meet the global demand requires large-scale manufacturing capacity. Several vaccine platforms, based on different antigen production systems, have been employed in the search for effective vaccine products. Due to the rapid pace at which the development of vaccine products is occurring, great uncertainty is associated with which platforms will receive regulatory approval and the timeline in which this will occur. This provided a challenge for the Covid-19 vaccine manufacturing system in terms of manufacturing capacity planning. This study investigated the impact that process flexibility can have in reducing the negative impact of the high demand uncertainty associated with vaccine approval for the Covid-19 vaccine manufacturing system. A discrete-event simulation model was developed in Tecnomatix Plant Simulation to investigate this. The model was verified via a series of model execution tests and the results were used to correct errors in the model. Further, the model was validated by conducting semi-structured interviews with subject matter experts in the fields of vaccine development and manufacturing. The feedback from the interviews informed improvements to the model. It was uncovered in this study that process flexibility significantly improves the performance, in terms of throughput, for a manufacturing system with high demand uncertainty when either the long chain or full flexibility configuration is incorporated (the throughput improved between 25% and 119%). The throughput performance for the full flexibility configuration is markedly better than that of the long chain configuration. The capital costs associated with the full flexibility configuration are often, however, viewed as an unjustifiable expense. Process flexibility investment decisions should thus also consider the capital costs associated with process flexibility configurations. It was observed that the operating cost per dose for stainless-steel equipment is significantly higher compared to single-use equipment. Many factors, however, contribute to the costs of vaccine manufacturing, and the observations in terms of the operating cost per dose for the vaccine manufacturing facilities in other circumstances may significantly differ. This study’s results did indicate that process flexibility can potentially improve the performance of a facility utilising stainless-steel equipment. It is, however, required that aspects such as regulatory approval, equipment capabilities, and capital costs are considered to determine the feasibility of a flexible stainless-steel equipment facility. This study can inform the decision on whether to further investigate the feasibility of incorporating process flexibility in a manufacturing facility utilising stainless-steel. The model developed in this study could be adjusted to investigate other research problems associated with process flexibility in vaccine manufacturing systems.
Three examples of alternative applications have been identified. One of these applications involves investigating a facility that continuously manufactures a routine vaccine product, while some of the manufacturing capacity is reserved for shifting between different epidemic products. The demand for these products will fluctuate based on epidemiological outbreaks.
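The benefit of the long chain reported above can be illustrated outside a discrete-event simulator: sampling uncertain demand and solving a max-flow over each plant-product flexibility configuration gives an estimate of expected throughput. In the sketch below the capacities and demand distribution are invented, and a max-flow computation stands in for the thesis's simulation model.

```python
# Sketch: expected throughput of dedicated, long-chain and fully
# flexible plant-product networks under uncertain demand.
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import maximum_flow

N, CAP = 4, 100  # four plants, four products, 100 units capacity per plant
dedicated = np.eye(N, dtype=int)
long_chain = np.eye(N, dtype=int) + np.roll(np.eye(N, dtype=int), 1, axis=1)
full = np.ones((N, N), dtype=int)

def expected_throughput(links, samples=500, seed=0):
    rng = np.random.default_rng(seed)
    total = 0
    for _ in range(samples):
        demand = rng.integers(50, 151, N)  # uncertain product demand
        # Nodes: 0 = source, 1..N = plants, N+1..2N = products, 2N+1 = sink.
        size = 2 * N + 2
        cap = np.zeros((size, size), dtype=np.int32)
        cap[0, 1:N + 1] = CAP                          # source -> plants
        cap[1:N + 1, N + 1:2 * N + 1] = links * CAP    # plant -> product links
        cap[N + 1:2 * N + 1, 2 * N + 1] = demand       # products -> sink
        total += maximum_flow(csr_matrix(cap), 0, size - 1).flow_value
    return total / samples

for name, links in [("dedicated", dedicated), ("long chain", long_chain),
                    ("full", full)]:
    print(name, expected_throughput(links))
```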
- ItemReal-time cloud-based stochastic scheduling incorporating mobile clients and a sensor network(Stellenbosch : Stellenbosch University, 2016-03) McOnie, Cameron; Bekker, James; Stellenbosch University. Faculty of Engineering. Dept. of Industrial Engineering.ENGLISH ABSTRACT: Scheduling within manufacturing environments is often complicated due to the complex, dynamic and stochastic characteristics such environments exhibit. These characteristics pose problems for off-line scheduling techniques as schedules, initially determined to be acceptable, may degrade or even become infeasible as the state of the system changes. On-line techniques attempt to address this challenge by performing scheduling concurrently with the manufacturing system. By reacting to system disturbances in real-time, on-line schedulers are capable of producing better schedules, or schedule control laws, when compared to off-line techniques. This study proposes a software architecture for a simulation-based reactive scheduling system. The architecture addresses what the main components of a reactive scheduler are and how they are related. Furthermore, it describes each of the components from multiple viewpoints, i.e., logical, process, development, and deployment, predominantly using the unified modelling language. The design decisions used to arrive at architecture qualities such as scalability, modularity, and interoperability are also discussed. Particular attention is given to defining a service contract between the back-end of a reactive scheduling system and data capture and decision support devices located on the shop floor. The proposed architecture is applied through the construction of a simulation-based reactive scheduling system, capable of reacting to real-time disturbances. The base of the system is a simulation model of a pressure gauge assembly operation. Interaction with the simulation model is done through a scheduling application server. The system also comprises a sensor network prototype, used as a means of tracking the movement of work-in-process through the assembly operation; and a mobile client, used to communicate decision support data back to the shop floor. The scheduling application server is deployed to the cloud and is exposed as a Web service for shop floor devices to consume. An experiment that compares the effect of rescheduling using dispatching rules on the system over time is performed. It is shown that as the system state progresses, the recommended dispatching rule may change, and that, therefore, embedding the associated control law into the shop floor would improve the manufacturing objective. This experiment illustrates the value of reactive scheduling in the presence of real-time events.
- ItemRequirements specification for the optimisation function of an electric utility's energy flow simulator(Stellenbosch : Stellenbosch University, 2015-03) Hatton, Marc; Bekker, James; Stellenbosch University. Faculty of Engineering. Dept. of Industrial Engineering.ENGLISH ABSTRACT: Efficient and reliable energy generation capability is vital to any country's economic growth. Many strategic, tactical and operational decisions take place along the energy supply chain. Shortcomings in South Africa's electricity production industry have led to the development of an energy flow simulator. The energy flow simulator is claimed to incorporate all significant factors involved in the energy flow process from primary energy to end-use consumption. The energy flow simulator thus provides a decision support system for electric utility planners. The original aim of this study was to develop a global optimisation model and integrate it into the existing energy flow simulator. After gaining an understanding of the architecture of the energy flow simulator and scrutinising a large number of variables, it was concluded that global optimisation was infeasible. The energy flow simulator is made up of four modules and is operated on a module-by-module basis, with inputs and outputs flowing between modules. One of the modules, namely the primary energy module, lends itself well to optimisation. The primary energy module simulates coal stockpile levels through Monte Carlo simulation. Classic inventory management policies were adapted to fit the structure of the primary energy module, which is treated as a black box. The coal stockpile management policies that are introduced provide a prescriptive means to deal with the stochastic nature of the coal stockpiles. As the planning horizon continuously changes and the entire energy flow simulator has to be re-run, an efficient algorithm is required to optimise stockpile management policies. Optimisation is achieved through the rapidly converging cross-entropy method. By integrating the simulation and optimisation model, a prescriptive capability is added to the primary energy module. Furthermore, this study shows that coal stockpile management policies can be improved. An integrated solution is developed by nesting the primary energy module within the optimisation model. Scalability is incorporated into the optimisation model through a coding approach that automatically adjusts to an ever-changing planning horizon as well as the commission and decommission of power stations. As this study is the first of several research projects to come, it paves the way for future research on the energy flow simulator by proposing future areas of investigation.
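The cross-entropy method at the heart of this integration is a simple sampling-and-elite-refit loop. The sketch below applies it to a one-variable, noisy stand-in for a stockpile policy (a single reorder point); the cost function and all parameters are illustrative, not the thesis's primary energy module.

```python
# Sketch: cross-entropy optimisation of a noisy, simulation-style cost.
import numpy as np

rng = np.random.default_rng(0)

def simulated_cost(reorder_point):
    """Stand-in stochastic 'simulation': penalises shortage and holding."""
    demand = rng.normal(500, 80)
    return 10.0 * max(demand - reorder_point, 0) + 1.0 * max(reorder_point - demand, 0)

mu, sigma = 300.0, 200.0          # initial Gaussian sampling distribution
N_SAMPLES, N_ELITE, N_REPS = 100, 10, 30
for _ in range(50):               # iterate until the distribution collapses
    candidates = rng.normal(mu, sigma, N_SAMPLES)
    costs = np.array([np.mean([simulated_cost(c) for _ in range(N_REPS)])
                      for c in candidates])
    elite = candidates[np.argsort(costs)[:N_ELITE]]  # lowest estimated cost
    mu, sigma = elite.mean(), elite.std()            # refit to the elite
    if sigma < 1.0:
        break
print("near-optimal reorder point:", round(mu, 1))
```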
- ItemShadow sort - a hybrid non-dominated sorting algorithm(Stellenbosch : Stellenbosch University, 2022-04) Trankle, Nicholas Albert; Bekker, James; Stellenbosch University. Faculty of Engineering. Dept. of Industrial Engineering.ENGLISH ABSTRACT: In this study a novel, hybrid, non-dominated sorting algorithm called Shadow Sort is presented. The algorithm was developed bearing in mind real-world, practical requirements of non-dominated sorting algorithms. The majority of non-dominated sorting algorithms are employed in conjunction with a multi-objective optimisation algorithm, many of which make use of population-based meta-heuristic techniques. The outputs of each population-based meta-heuristic iteration typically include hundreds, if not thousands, of solutions, which need to be sorted in order to prime the next evolutionary iteration. However, of all the solutions proposed by any given meta-heuristic iteration, a relatively low number of those solutions are of any use. Therefore, Shadow Sort approaches all non-dominated sorting by eliminating dominated solutions as early as possible, rather than processing all solutions in order to find their respective Pareto set. Shadow Sort was developed in Matlab, and is firstly compared to other non-dominated sorting algorithms, namely Deductive Sort, Efficient Non-dominated Sort and Best Order Sort. Next, a use-case test is performed where Shadow Sort is compared to the best-performing competitor non-dominated sorting algorithm by implementing both into state-of-the-art multi-objective evolutionary algorithms. Several cases of Shadow Sort are proposed based on the number of objectives in the population. It was found that Shadow Sort is competitive when optimising two and three objectives, for various population sizes.
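For readers unfamiliar with the operation being accelerated, the sketch below shows a naive non-dominated filter under minimisation: it makes the dominance test explicit, though it does not reproduce Shadow Sort's hybrid early-elimination strategy, and it is in Python rather than the thesis's Matlab.

```python
# Sketch: naive Pareto filtering via an explicit dominance test (minimisation).
import numpy as np

def dominates(a, b):
    """a dominates b if a is no worse in every objective and better in one."""
    return np.all(a <= b) and np.any(a < b)

def pareto_front(population):
    front = []
    for i, p in enumerate(population):
        if not any(dominates(q, p) for j, q in enumerate(population) if j != i):
            front.append(p)
    return np.array(front)

pop = np.array([[1.0, 4.0], [2.0, 2.0], [3.0, 3.0], [4.0, 1.0], [4.0, 4.0]])
print(pareto_front(pop))  # [[1,4],[2,2],[4,1]] survive; the rest are dominated
```

This naive filter is O(n²) in the population size; the point of algorithms such as Shadow Sort is to discard dominated solutions long before every pairwise comparison is made.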