A genetic algorithm is commonly used in optimization problems
A genetic algorithm is commonly used for optimization problems in artificial intelligence, while a standard algorithm is used for solving general problems. In a genetic algorithm, the problem is encoded as a series of bit strings that are later manipulated by the algorithm. A standard algorithm, on the other hand, uses formulas applied directly in the spreadsheet cells. Genetic algorithms belong to the class of "direct search" methods, which are used for optimization. The main distinction between genetic algorithms and other standard approaches is how the next points are selected. A genetic algorithm begins with at least two points, and subsequent points are chosen by merging or transforming those with the best results. Standard algorithms collect their next points in varying ways: some randomly select points, while others use a fixed pattern to select nearby locations. Pattern-search methods have the advantage that, for certain classes of models, they can be proven to reach an optimum.
Since today's computers are considerably more powerful than PCs were a few years ago, complex multi-objective optimization problems can now be formulated and solved quickly. Genetic algorithms are heuristics, and adaptive computations can use evolutionary algorithms that operate on the algorithm space rather than on the search space of solutions.
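To make the bit-string encoding and selection-by-merit ideas above concrete, here is a minimal genetic algorithm sketch in Python. The fitness function, population size, and rates are illustrative choices, not values from the text; the example maximizes the count of 1-bits ("OneMax"):

```python
import random

def genetic_algorithm(fitness, n_bits=16, pop_size=20, generations=100,
                      crossover_rate=0.9, mutation_rate=0.02, seed=0):
    """Maximize `fitness` over bit strings of length `n_bits`."""
    rng = random.Random(seed)
    # Initial population: random bit strings.
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(generations):
        # Tournament selection: keep the fitter of two random individuals.
        def select():
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        children = []
        while len(children) < pop_size:
            p1, p2 = select(), select()
            if rng.random() < crossover_rate:        # single-point crossover
                cut = rng.randrange(1, n_bits)
                c1, c2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            else:
                c1, c2 = p1[:], p2[:]
            for c in (c1, c2):                       # bit-flip mutation
                for i in range(n_bits):
                    if rng.random() < mutation_rate:
                        c[i] = 1 - c[i]
                children.append(c)
        pop = children[:pop_size]
        best = max(pop + [best], key=fitness)        # simple elitism
    return best

# Example: maximize the number of 1-bits.
onemax = lambda bits: sum(bits)
solution = genetic_algorithm(onemax)
```

Note how the next candidate points come only from merging (crossover) and transforming (mutation) the better members of the current population, which is exactly the distinction drawn above between genetic and standard search methods.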
Question 2
The general process of simulation involves:
- Definition of the problem: the initial step involves defining the objectives and goals of the study, followed by a description of the system under observation.
- Planning of the project: the project is broken down into workable packages, and a team is assigned to each fragment. Milestones are set to monitor the progress of the project.
- Definition of systems: it involves the identification of the components of the system to be modeled and the measure of performance to be analyzed.
- Formulation of Model: to understand the essential operation of the actual process, a model is required to determine the basic requirements in the process.
- Data collection and analysis: the type of data to be collected is determined. The data is then fitted to a theoretical distribution.
- Translation of model: the model is translated into a programming language. The range of choices runs from general-purpose languages such as Fortran to simulation packages such as Arena.
- Verification and validation: this step ensures that the model works as intended, which is done either by animation or debugging.
- Experimentation and analysis: this process involves the development of alternative model simulation, execution, and comparison of the available alternatives.
- Documentation and implementation: this consists of the report and presentation. It also contains the outcomes of the study and their implications, and the best course of action is identified, recommended, and justified.
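The "translation of model" step above can be sketched with a tiny hand-coded simulation rather than a package like Arena. The following is a minimal single-server queue model in plain Python; the arrival and service rates are illustrative assumptions, not values from the text:

```python
import random

def simulate_queue(arrival_rate=0.8, service_rate=1.0,
                   n_customers=10_000, seed=1):
    """Minimal single-server (M/M/1-style) queue: returns mean waiting time."""
    rng = random.Random(seed)
    clock = 0.0            # arrival time of the current customer
    server_free_at = 0.0   # time when the server next becomes idle
    total_wait = 0.0
    for _ in range(n_customers):
        clock += rng.expovariate(arrival_rate)     # next customer arrives
        start = max(clock, server_free_at)         # wait if the server is busy
        total_wait += start - clock
        server_free_at = start + rng.expovariate(service_rate)  # service ends
    return total_wait / n_customers                # measure of performance

mean_wait = simulate_queue()
```

The mean waiting time returned here is the "measure of performance to be analyzed" from the system-definition step, and running the model with alternative parameter values corresponds to the experimentation-and-analysis step.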
Exercise 1
Web-based simulation has many benefits in comparison to classical systems, along with some drawbacks:
- Ease of use: the ease of use of web-based simulation models is widely acknowledged in the simulation community.
- Collaboration: collaboration is essential in the development of a successful simulation project, and the Web makes it easier.
- License and deployment: regardless of how it was created, the tool can be deployed locally, in an intranet environment, or over the Internet.
- Reuse: web-based simulation components lend themselves to reuse; the topic has featured at the Winter Simulation Conference.
- Capability across platforms: the Internet enables an application to run in any browser on any operating system without porting it.
- Controlled access: access to a web-based simulation program can be managed using passwords, and restricted time-span access can be allocated.
- Broad accessibility: a web-based simulation program can be used from anywhere in the world with an Internet connection, beyond regular business hours, without having to ship network equipment.
- Interoperability and integration: a web-based platform can incorporate and communicate with current and future web-based applications, as well as desktop applications.
- Limited graphical user interface: the web interface is limited in comparison to desktop simulation tools.
- Security vulnerability: web-based applications are more vulnerable to malicious attacks than applications running on clients.
Chapter 11
Question 1
Why are automated decision systems so important for business applications?
- It reduces operational costs: one aim of an automated decision system is to reduce operational costs and increase profitability.
- It increases productivity: for an organization to grow, productivity is a significant concern. IT software has provided significant gains in the HR environment.
- Ensuring high availability: companies are ever more dependent on their computers. Online systems are regularly used for daily business: purchase entry, appointments, assembly guidance, shipment orders; the list continues. If the machine is not available, the organization suffers.
- To increase reliability: reliability is the real treasure that makes automation shine. It is the backbone of any competent IT operations department, and without it you have uncertainty, confusion, and dissatisfied users.
- To optimize performance: each organization wants its business to perform like a racehorse, but systems tend to become weighed down. While technological developments make them faster and less costly each year, the strain on them eventually catches up with and surpasses the technical capacity an organization possesses, leaving many businesses looking to boost the efficiency of their systems.
Question 2
It is said that powerful computers, inference capabilities, and problem-solving heuristics are necessary but not sufficient for solving real problems. Explain.
This is because every problem draws on different knowledge and requires a varied approach to solve. Problem-solving heuristics have long existed but have improved with time. The three factors are necessary for solving real problems; what matters is how the key variables change from problem to problem. Beyond these three elements, other factors must be taken into account: how human and emotional factors intervene and alter the situation, and whether the approaches found using the three variables above are achievable and favorable.
Question 3
Explain the relationship between the development environment and the consultation (i.e., runtime) environment.
In its purest form, environmental assessment is a planning method that is now widely recognized as an integral component of sound decision-making. The development environment involves the activities and resources required for obtaining and portraying the information, as well as for drawing valid conclusions and justifying them. Knowledge engineers and domain experts, who act as builders, are the major players in this area. Upon completion, the program is used by the non-expert user through the consultation environment. Both include numerous public comment opportunities when deciding whether an EA is needed, setting the scope of environmental review, and during the technical analysis. Although both regimes try to prevent damage to the environment, neither has an assured concrete outcome. Both rest on the assumption that procedure can affect decision-making without a specific assurance. Both structures are based on the transformative potential of good-faith negotiations, and the assumption that processes will alter outcomes, founded upon a base of active public engagement. Consultation systems and EA systems fail without active and dedicated stakeholder participation. They are strongly influenced by politics. Under the federal framework, it is a political decision to agree to undertake a project despite substantial environmental damage and despite any concerns about the adequacy of consultation.
Question 4
Explain the difference between forward chaining and backward chaining and describe when each is most appropriate.
As the name suggests, forward chaining begins from the known facts and moves forward by applying inference rules to derive further data, continuing until it reaches the goal. In contrast, backward chaining starts from the goal and moves backward, using inference rules to determine the facts that support the goal. Forward chaining is called data-driven inference, whereas backward chaining is called goal-driven inference. A breadth-first search strategy is used in forward chaining, while a depth-first search strategy is used in backward chaining. Both methods apply the modus ponens inference rule. Forward chaining can be slow, as it checks all the rules, while backward chaining is quicker, as it only examines the few rules needed. Forward chaining may generate many ASK queries to the knowledge base, while backward chaining generates fewer. Forward chaining can resemble an exhaustive search, while backward chaining avoids unnecessary directions of reasoning. Forward chaining is suited to activities such as planning, control of design processes, analysis, and sorting, while backward chaining is suited to classification and diagnosis.
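The two directions of inference described above can be sketched in a few lines of Python. The rule base below is a hypothetical illustration (it is not from the text); rules are (premises, conclusion) pairs and facts are plain strings:

```python
# Hypothetical rule base: each rule is (set of premises, conclusion).
RULES = [
    ({"croaks", "eats flies"}, "frog"),
    ({"frog"}, "green"),
    ({"chirps", "sings"}, "canary"),
    ({"canary"}, "yellow"),
]

def forward_chain(facts, rules):
    """Data-driven: fire every rule whose premises hold until nothing new fires."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

def backward_chain(goal, facts, rules):
    """Goal-driven: prove the goal by recursively proving the premises
    of some rule that concludes it (assumes no cyclic rules)."""
    if goal in facts:
        return True
    return any(all(backward_chain(p, facts, rules) for p in premises)
               for premises, conclusion in rules if conclusion == goal)

facts = {"croaks", "eats flies"}
derived = forward_chain(facts, RULES)                 # derives "frog", "green"
proved = backward_chain("green", facts, RULES)        # True
```

Note the asymmetry the answer describes: `forward_chain` visits every rule on every pass (breadth-first, data-driven), while `backward_chain` only touches the rules whose conclusions lie on the path to the goal (depth-first, goal-driven).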
Exercise 4
ABC business applied artificial intelligence to procurement and the supply chain after facing a problem with inaccurate data. Purchasing and procurement were based on wrong data, leading to an excess of some inventory and shortages of other stock. Problems with commodity data stem from the specific information structure and norms of this kind of data. Each manufacturer may use its own item numbering, which can include inconsistent or non-standard details in the description fields. Such systems encapsulate the processes that drive a company every day, but they usually lack integrated data-quality capabilities to identify and eradicate bad data.
Issues including repeated inventory numbers, outdated inventory IDs, and contradictory item details exist in the enterprise, affecting every aspect of the project. An inability to understand the goods being marketed will significantly impair the company's ability to prepare for new products in the future. The bottom line is that inferior-quality data causes difficulties in managing manufacturing costs, reduces the company's profitability, and affects the distribution of final products. After all, the information inside your systems powers every decision you make, from long-range tactical planning to standard procedures.
With data mining, artificial intelligence can provide reliable insight for decision-makers. AI has given companies useful insight into customers, which helps them increase contact with customers. It also allows retailers to anticipate and react quickly to demand for a product. In the end, opinion mining helps businesses understand why people feel the way they do. Very often, the issues raised by a single customer are shared by many others; if enough viewpoints are correctly collected and evaluated, the data will help entities gauge and forecast the silent majority's concerns. Through automation, which is faster and more efficient, AI has enhanced this mining process, assisting organizations in making critical business decisions. With data mining, ABC business can collect inventory data and track the flow of stock, thus making the right procurement decisions.
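A first, minimal step toward the data-quality problems described above (repeated and inconsistently formatted inventory IDs) can be automated with a short script. The records below are hypothetical examples, not ABC's actual data:

```python
from collections import Counter

# Hypothetical inventory records, as they might arrive from different manufacturers.
records = [
    {"item_id": "A-100", "description": "Steel bolt 10mm"},
    {"item_id": "A-100", "description": "BOLT, STEEL, 10 MM"},  # duplicate ID, non-standard text
    {"item_id": "B-205", "description": "Copper wire 2m"},
    {"item_id": "b-205", "description": "Copper wire 2m"},      # same ID, inconsistent case
]

def normalise(item_id):
    """Canonicalize an ID so trivially different spellings compare equal."""
    return item_id.strip().upper()

def find_duplicate_ids(records):
    """Return the normalized IDs that occur more than once."""
    counts = Counter(normalise(r["item_id"]) for r in records)
    return sorted(item for item, n in counts.items() if n > 1)

duplicates = find_duplicate_ids(records)  # ['A-100', 'B-205']
```

Flagging such duplicates before they feed purchasing decisions is exactly the kind of integrated data-quality check the text notes is usually missing from day-to-day systems.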