Generally, data mining (sometimes called data or knowledge discovery) is the process of analyzing data from different perspectives and summarizing it into useful information - information that can be used to increase revenue, cut costs, or both. Data mining software is one of a number of analytical tools for analyzing data. It allows users to analyze data from many different dimensions or angles, categorize it, and summarize the relationships identified. Technically, data mining is the process of finding correlations or patterns among dozens of fields in large relational databases.
Data mining consists of five major elements:
Extract, transform, and load transaction data onto the data warehouse system.
Store and manage the data in a multidimensional database system.
Provide data access to business analysts and information technology professionals.
Analyze the data by application software.
Present the data in a useful format, such as a graph or table.
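The five elements above can be illustrated as one minimal end-to-end sketch. The transaction data, the toy "warehouse" dictionary, and the text-table report are all invented placeholders for a real warehouse system.

```python
# Minimal sketch of the five data mining elements, using in-memory
# structures as stand-ins for a real warehouse and reporting layer.

# 1. Extract, transform, and load transaction data.
transactions = [("2024-01-02", "widgets", 120), ("2024-01-03", "widgets", 80),
                ("2024-01-02", "gadgets", 200)]
loaded = [{"date": d, "product": p, "amount": a} for d, p, a in transactions]

# 2. Store and manage the data in a (toy) multidimensional structure:
# indexed by product, then by date.
warehouse = {}
for row in loaded:
    warehouse.setdefault(row["product"], {})[row["date"]] = row["amount"]

# 3. Provide data access through a simple query function.
def query(product):
    return warehouse.get(product, {})

# 4. Analyze the data: total sales per product.
totals = {p: sum(dates.values()) for p, dates in warehouse.items()}

# 5. Present the data in a useful format (here, a text table).
for product, total in sorted(totals.items()):
    print(f"{product:10s} {total:6d}")
```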
Data mining agent
A data mining agent is a software program built for the primary purpose of finding information efficiently. It is a type of intelligent agent that operates in a data warehouse and does the actual dirty work involved in finding sometimes less than obvious relationships between different pieces of data. This type of agent is able to detect major trend changes, as well as detect new pertinent information. If a new piece of information is found, the agent will generally attempt to alert the end-user of the new information.
For example, a corporation may develop an agent to analyze economic trends. If the agent detects that consumers are becoming more conservative, it alerts management to the change. With this information in hand, management can better plan how to produce, market, and sell its product, making the whole process more efficient.
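The economic-trend example can be sketched as follows. The indicator readings, the window size, and the threshold are all invented for illustration; a real agent would pull the indicator from live data sources.

```python
# Sketch of a data mining agent that watches an economic indicator and
# alerts the end user when it detects a trend change. The values and
# the -5% threshold are invented for illustration.

def detect_trend_change(series, window=3, threshold=-0.05):
    """Return True if the average relative change over the last `window`
    observations falls below `threshold` (consumers turning conservative)."""
    if len(series) < window + 1:
        return False
    recent = series[-(window + 1):]
    changes = [(b - a) / a for a, b in zip(recent, recent[1:])]
    return sum(changes) / len(changes) < threshold

alerts = []

def agent_step(spending_index, history):
    """One monitoring cycle: record the reading and alert on a downturn."""
    history.append(spending_index)
    if detect_trend_change(history):
        alerts.append(f"Trend change detected: index now {spending_index}")

history = []
for reading in [100, 101, 99, 92, 85, 78]:
    agent_step(reading, history)
```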
Data Mining Process and Agents
The intelligent agent paradigm can be used to automate individual tasks across the steps of knowledge discovery, which include data preparation, mining model selection and application, and output analysis. In data preparation, agents are especially useful for sensitivity analysis of learning parameters, for applying triggers on database updates, and for handling missing or invalid data.
In the data mining model itself, agent-based approaches have been applied to classification, clustering, summarization, and generalization - tasks that involve learning and rule generation - since current learning methods are able to find regularities in large data sets. An intelligent agent can combine domain knowledge, embedded as simple rules, with training data to learn on its own and so reduce the need for domain experts. In interpreting what is learned, a scanning agent can go through the generated rules and facts and identify items that may contain valuable information.
Data preparation in data mining involves data selection, data cleansing, data preprocessing, and data representation. With intelligent agents, several of these steps can be automated. One possibility for automating the data selection step is to perform automatic sensitivity analysis to determine which parameters should be used in learning. This would reduce the dependency on having a domain expert available to examine the problem every time something changes in the environment.
Data cleansing could be automated through the use of an intelligent agent with a rule base. When a record is added or updated in a relational database, a trigger could call the intelligent agent to examine the transaction data. The rules in its rule base would specify how to cleanse missing or invalid data.
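A minimal sketch of such a rule-based cleansing agent follows. In a real system a database trigger would invoke the agent on each insert or update; here the "trigger" is a direct function call, and the rules and field names are invented examples.

```python
# Sketch of a rule-based cleansing agent. Each rule pairs a condition
# on the record with a repair action; the rules shown are illustrative.

cleansing_rules = [
    # (condition on the record, repair action)
    (lambda r: r.get("age") is None,  lambda r: r.update(age=0)),
    (lambda r: r.get("age", 0) < 0,   lambda r: r.update(age=abs(r["age"]))),
    (lambda r: not r.get("country"),  lambda r: r.update(country="UNKNOWN")),
]

def on_insert(record):
    """Trigger handler: apply every matching cleansing rule to the record."""
    for condition, repair in cleansing_rules:
        if condition(record):
            repair(record)
    return record

rec = on_insert({"name": "Smith", "age": None, "country": ""})
```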
Data preprocessing also requires domain knowledge, since there is no way to know the semantics of the attributes and relationships like computed or derived fields. However, more standard preprocessing and data representation steps such as scaling or dimensionality reduction, symbol mapping, and normalization, which are usually specified by the data mining expert, could be automated using rules and basic statistical information about variables.
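The "standard" preprocessing steps mentioned above can indeed be driven purely by per-variable statistics. This sketch shows min-max scaling and z-score normalization computed from basic column summaries; the income column is invented sample data.

```python
# Sketch of automated preprocessing using only basic statistics about
# each variable: min-max scaling and z-score normalization driven by
# per-column summaries, with no domain knowledge required.
import statistics

def column_stats(values):
    return {"min": min(values), "max": max(values),
            "mean": statistics.mean(values), "stdev": statistics.pstdev(values)}

def scale(values, stats):
    """Min-max scaling into [0, 1]."""
    span = (stats["max"] - stats["min"]) or 1.0
    return [(v - stats["min"]) / span for v in values]

def normalize(values, stats):
    """Z-score normalization (zero mean, unit variance)."""
    sd = stats["stdev"] or 1.0
    return [(v - stats["mean"]) / sd for v in values]

incomes = [20, 40, 60, 80, 100]          # invented sample column
stats = column_stats(incomes)
scaled = scale(incomes, stats)
normalized = normalize(incomes, stats)
```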
Searching for patterns of interest through learning and intelligence in classification, clustering, summarization, and generalization can also be delegated to intelligent agents. An agent can learn from a profile or from examples, and feedback from the user can be used to refine confidence in the agent's predictions. In the understanding of what is learned, however, agents are typically limited to a fixed role, or are simply programs used for visualization.
The major advantage of using intelligent agents to automate data mining is their support for mining online transaction data. When new data is added to the database, an alarm or triggering agent can send events to the main mining application and to its learning task, so that the new data can be evaluated against what has already been mined. This automated decision support using triggers is called "active data mining". Since the main mining functions can be performed with learning methods, implementing them with intelligent agents yields a flexible, modular, and delegated solution. The paradigm also lends itself to parallelizing data mining algorithms, given its suitability for distributed environments.
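The trigger-driven "active data mining" loop can be sketched in a few lines. The class names and the simple item-frequency model are illustrative assumptions; the point is only the event flow from trigger agent to mining application.

```python
# Sketch of "active data mining": a triggering agent sends an event to
# the mining application whenever new data arrives, so the new records
# are evaluated against what has already been mined.

class MiningApplication:
    def __init__(self):
        self.item_counts = {}   # what has been "mined" so far
        self.events = []

    def on_new_data(self, records):
        """Learning task: fold new records into the mined model."""
        for item in records:
            self.item_counts[item] = self.item_counts.get(item, 0) + 1
        self.events.append(f"evaluated {len(records)} new record(s)")

class TriggerAgent:
    """Fires when the database is updated, forwarding the new data."""
    def __init__(self, app):
        self.app = app

    def database_updated(self, new_records):
        self.app.on_new_data(new_records)

app = MiningApplication()
agent = TriggerAgent(app)
agent.database_updated(["milk", "bread"])
agent.database_updated(["milk"])
```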
The architecture is composed of the following modules:
The Storage Data Module (SDM): data is stored beforehand in data marts; this module handles loading of the data.
The Hierarchy Concepts Module (HCM): retrieves concept hierarchies that have already been built.
The Data Mining Module (DMM): subdivided into two sub-modules, the first for decisional extraction models and the second for retrieving previously constructed models.
The Assistant Module (AM): dedicated to assisting users, especially novice users, during the data mining phase.
The Ontologies Module: an ontology design tool that helps the user define ontologies and communicates the results to the ontology agents.
The system distinguishes three user profiles:
An assistant agent guides the user in his choices according to profile; the profiles distinguished are novice, connoisseur, and expert.
A novice user reaches the system for the first time and has no history. The system must guide him and provide prepared solutions by calling upon the assistant agent.
An expert user may be the administrator. He can build a new data mart from the data sources and prepare the requests and knowledge-extraction results for later use.
A connoisseur user has an intermediate profile: he knows the field and can identify his needs. The system must give him the means to formulate his own requests; if these cannot be satisfied, an assistant is called.
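The routing between the three profiles can be sketched as a simple dispatch. The profile names follow the text; the response strings and the fallback behavior for a connoisseur without a request are illustrative assumptions.

```python
# Sketch of profile-based assistance: the system routes each user to
# the appropriate level of help according to his profile.

def assist(profile, request=None):
    if profile == "novice":
        # The assistant agent guides a novice with prepared solutions.
        return "guided: presenting prepared solutions via the assistant agent"
    if profile == "connoisseur":
        # A connoisseur formulates his own requests; the assistant is
        # called only when no request could be formulated.
        return f"executing request: {request}" if request else "assistant called"
    if profile == "expert":
        return "full access: build data marts and prepare extraction requests"
    raise ValueError(f"unknown profile: {profile}")
```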
The design starts with the definition of the agents which compose it and the modes of interactions between them. The various agents identified on this level are the following:
An evaluator agent: helps the user choose the result that best meets his needs.
A comparison agent: compares the various results obtained from the different launched algorithms.
A coordinator agent: since the data are stored in data marts, this agent extracts the data and provides them to the other agents.
One or more data mining agents: execute the data mining algorithms according to the choice that was made. There is thus a generalization agent, a classification agent, and a characterization agent; these agents run at the same time so that the evaluator agent can do its work.
An interface agent: depending on the user's profile, it acts as the interface between the data mining agents on one hand and the software used to display the results on the other. According to the profile, it offers the user the possibility of formulating requests and of displaying the results in the form of rules.
These agents can be used in a distributed environment whereby agents communicate no matter where they are located.
Petz (Dogz and Catz) is a series of games dating back to 1995; users can invite autonomous agents to live on their PC desktop. A user might interact with her PETZ by petting them, introducing them to one another at different ages, feeding (or not feeding) them, and so forth. Unlike CREATURES, users do not have direct access to any sort of slider-bar tweaking of internals. In this way, a user's PETZ are truly autonomous. The PETZ agents use real-time animation and the layered display of multiple simultaneous behaviors to create the illusion of lifelike continuous motion.
PETZ graphic behaviors mimic those simple behaviors we expect of real pets: DOGZ, for example, wag their tails; have perky ears and expressive eyes; have noses that follow "scents"; have spots in which they like to be petted; and have tongues that can be used to express being tired, excited, thirsty, and hungry.
PETZ have persistent personalities, but these can change over time if, for example, a pet is not fed appropriately (pets come with virtual food and virtual toys). Additionally, users can modify the behavior of their PETZ by squirting them with a virtual spray bottle.
Software Agents for Planning, Monitoring, and Optimizing Travel
The standard approach to planning business trips is to select the flights, reserve a hotel, and possibly reserve a car at the destination. The choices of which airports to fly into and out of, whether to park at the airport or take a taxi, and whether to rent a car at the destination are often made in an ad hoc way based on past experience. These choices are frequently suboptimal, but the time and effort required to make more informed choices usually outweighs the benefit. Similarly, once a trip has been planned it is usually ignored until a few hours before the first flight. A traveler might check on the status of the flights or use one of the services that automatically notify a traveler of flight status information, but otherwise a traveler just copes with problems that arise, as they arise. Beyond flight delays and cancellations there are a variety of possible events in the real world that one would ideally like to anticipate, but again the cost and effort required to monitor for these events is not usually deemed to be worth the trouble. Schedules can change, prices may go down after purchasing a ticket, flight delays can result in missed connections, and hotel rooms and rental cars are given away because travelers arrive late.
Agent Access to Online Sources
In order to provide access to the data in an existing HTML source, a wrapper is constructed. A wrapper is simply a program that understands the structure of a specific web site and uses that knowledge to accept queries to that site and produce answers to those queries in a structured format, such as XML. Information can then be retrieved from different web sites to get trip information.
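The idea of a wrapper can be sketched with a toy example. The page fragment, the row layout, and the field names below are all invented; a real wrapper would be built against the actual markup of one specific site.

```python
# Sketch of a wrapper: a program that knows the structure of one
# specific site's pages and turns them into structured records.
import re

# Invented sample of what the site's flight-listing rows might look like.
SAMPLE_PAGE = """
<tr><td>AA123</td><td>LAX</td><td>JFK</td><td>$350</td></tr>
<tr><td>UA456</td><td>LAX</td><td>JFK</td><td>$310</td></tr>
"""

# The wrapper's knowledge of the site is encoded in this row pattern.
ROW_PATTERN = re.compile(
    r"<tr><td>(\w+)</td><td>(\w+)</td><td>(\w+)</td><td>\$(\d+)</td></tr>")

def wrap(page):
    """Accept a raw page; return answers in a structured format."""
    return [{"flight": f, "from": o, "to": d, "price": int(p)}
            for f, o, d, p in ROW_PATTERN.findall(page)]

flights = wrap(SAMPLE_PAGE)
```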
Interactive Planning of a Trip
A general, interactive, constraint-based planner, called Heracles, was developed, which can then be applied to the problem of planning travel. The resulting system integrates a wide variety of travel-related data from web sources to provide the data that travelers need to plan a trip (Ambite et al. 2002). This system uses the information agents described in the previous section to provide real-time access to the many online sources related to travel. A traveler enters his or her origin and destination addresses and the dates of the trip, and the travel planner then interactively helps the traveler plan the trip. The system provides the choices of flights, hotels, ground transportation, etc. For each decision, the system makes a recommendation that optimizes a user-specified criterion, such as minimizing the overall cost of a trip.
Agents for Monitoring Travel
As part of the Electric Elves project (Chalupsky et al. 2001; Ambite et al. 2002) agent technologies were applied to build a set of agents for various monitoring tasks. In the case of monitoring travel plans, this task is particularly well-suited for applying agent technology for several reasons: a) this is a fairly complicated task with many possible forms of failure ranging from flight cancellations and schedule changes to hotel rooms being given away when a traveler arrives late at night, b) there are a large number of online resources that can be exploited to anticipate problems and keep a traveler informed, and c) these tasks would be tedious and impractical for a human to perform with the same level of attention that could be provided by a set of software agents.
To deploy a set of agents for monitoring a planned trip, the user first enters the travel itinerary as described in the previous section and then specifies which aspects of the trip he/she would like to have the agents monitor. A set of information agents are then spawned to perform the requested monitoring activities. For the travel planning application, we developed the following set of agents to monitor a trip:
An airfare-monitoring agent that tracks the current price of a flight itinerary.
A schedule-change agent that keeps track of the published schedule for a given flight itinerary and notifies a traveler if there is any change to this itinerary.
A flight-status agent that continually monitors the status of a flight. This agent also sends a fax to the hotel if the flight arrival is delayed past 5pm in order to ensure that the hotel room is held for the traveler.
An earlier-flight agent that checks for flights that will depart before the scheduled flight.
A flight-connection agent that monitors a traveler's connecting flights, checks the status and gate information of the connecting flights, and checks for earlier flights to the same destination.
A restaurant-finding agent that locates the closest restaurants based on the traveler's GPS location.
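The first agent in the list above can be sketched as follows. The itinerary name, the prices, and the notify-on-price-drop policy are illustrative assumptions; a real agent would poll an online fare source on a schedule.

```python
# Sketch of an airfare-monitoring agent: it tracks the current price of
# a flight itinerary and notifies the traveler when the quoted fare
# drops below the price already paid.

class AirfareMonitor:
    def __init__(self, itinerary, price_paid):
        self.itinerary = itinerary
        self.price_paid = price_paid
        self.notifications = []

    def check(self, current_price):
        """Called periodically with the latest quoted fare."""
        if current_price < self.price_paid:
            self.notifications.append(
                f"{self.itinerary}: fare dropped to ${current_price} "
                f"(paid ${self.price_paid})")

monitor = AirfareMonitor("LAX-JFK", 350)   # invented itinerary and fare
for quote in [360, 355, 340, 345]:         # invented fare quotes over time
    monitor.check(quote)
```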
Autonomous Agents as Synthetic Characters
These agents understand social relationships, maintain histories with users, have some knowledge of human emotion, are beginning to understand human speech, can speak themselves, and control media channels to deliver morphing faces, music, and theater-quality sound, all responsively and in real time; they therefore have tremendous inherent attachment-forming capabilities. Additionally, these agents are by definition at least partially autonomous. They may well live on after the user walks away from the terminal and may form relationships with other users. In short, they have their own synthetic lives.
An example is SAM, an affective reasoning system with emotion capabilities.