# Particle Swarm Optimization for Job Shop Scheduling


The Job-Shop Scheduling Problem (JSSP) is one of the most difficult combinatorial optimization problems and is classified as NP-hard. In many cases the combination of goals and resources increases the search space exponentially, so generating consistently good schedules is particularly difficult: the combinatorial search space is very large and there are precedence constraints between operations. Exact methods such as branch and bound and dynamic programming take considerable computing time to obtain the optimum solution. We propose a new method for solving the JSSP using Particle Swarm Optimization (PSO) hybridized with Simulated Annealing, and demonstrate its efficiency on standard job-shop scheduling benchmarks. Two important design points of the PSO are how to represent schedules as particles and how to design operations on that representation so as to produce better results. Our experimental results show that the proposed job-shop scheduling algorithm is more robust and efficient than the existing algorithms.

Keywords: Job-Shop Scheduling Problem, Particle Swarm Optimization, NP-Hard Problem, Simulated Annealing

## Introduction

The Job-Shop Scheduling Problem (JSSP) is a classical combinatorial optimization problem and has been shown to be NP-hard (Garey, 1976). In the job-shop scheduling problem, each of n jobs (n ≥ 1) must be processed on m machines (m ≥ 1) in a given sequence. The sequence of m machines is different for each job and cannot be changed during processing. The processing of one job on one machine is considered one operation, so each job needs a sequence of m operations (oj1, oj2, …, ojm) to complete its work. Each operation is processed on one of the m machines, and each machine can process at most one operation at a time; a job cannot interrupt the machine that is processing an operation of another job. The main objective of the JSSP is to find a schedule of operations that minimizes the maximum completion time (called the makespan), i.e., the time by which all operations in the schedule for the n jobs and m machines are completed.

The JSSP applies to manufacturing processes and directly affects the production time and production cost of a plant. During the past few decades, the JSSP has attracted many researchers to develop algorithms. Because the JSSP is NP-hard, it is difficult to develop an algorithm that finds a solution within a reasonable time, especially for higher dimensions. Recently, many researchers have used evolutionary and metaheuristic algorithms to solve the problem, such as Tabu Search, Genetic Algorithms, Simulated Annealing, Ant Colony Optimization and Particle Swarm Optimization. In this paper, we focus on exploiting the particle swarm optimization algorithm to achieve better solutions for the JSSP.

Scheduling is broadly defined as the process of assigning a set of tasks to resources over a period of time (Pinedo et al. 1999). Effective scheduling plays a very important role in today's competitive manufacturing world. Performance criteria such as machine utilization, manufacturing lead times, inventory costs, meeting due dates, customer satisfaction, and quality of products are all dependent on how efficiently the jobs are scheduled in the system.

Several types of manufacturing shop configurations exist in the real world. Based on the method of meeting customers' requirements, they are classified as either open or closed shops: in an open shop the products are built to order, whereas in a closed shop the demand is met from existing inventory. Based on the complexity of the process, shops are classified as single-stage single-machine, parallel machine, multi-stage flow shop and multi-stage job shop (Lawler et al. 1993). The single-stage shop configurations require only one operation to be performed on the machines. In multi-stage flow shops, several tasks are performed for each job and there exists a common route for every job. In multi-stage job shops, an option of selecting alternative resource sets and routes for the given jobs is provided. Hence the job shop allows flexibility in producing a variety of parts.

The processing complexity increases as we move from single stage shops to job shops. Various methods have been developed to solve the different types of scheduling problems in these different shop configurations for the different objectives. These range from conventional methods such as mathematical programming and priority rules to meta-heuristic and artificial intelligence-based methods (Holland, 1992).

Job shop scheduling is one of the most widely studied and most complex combinatorial optimization problems, and a vast amount of research has been performed in this area to schedule jobs effectively for various objectives. A large number of small to medium companies still operate as job shops (Lin et al. 2009). Despite the extensive research carried out, many such companies continue to experience difficulties with their specific job shop scheduling problems. Therefore, developing effective scheduling methods that provide good schedules with less computational time is still a requirement. Most real-world manufacturing companies aim at successfully meeting customer needs while improving performance efficiency.

Lin et al. (2009) proposed a new hybrid swarm intelligence algorithm consisting of particle swarm optimization, a simulated annealing technique and a multi-type individual enhancement scheme to solve the job-shop scheduling problem. Their paper also describes the basics of the Job-shop Scheduling Problem.

Huiyuan et al. (2009) established a dual-resource (machines and moulds) constrained job shop scheduling model, based on the actual factors of mass injection-molding job shop scheduling, and employed a heuristic active algorithm combined with priority rules to obtain a solution. Bozejko et al. (2009) solved the job shop scheduling problem using a parallel simulated annealing technique. Juang (2004) proposed a hybrid of a Genetic Algorithm and Particle Swarm Optimization for recurrent network design, describing how hybridizing GA with PSO allows each method to compensate for the other's disadvantages.

Ge et al. (2007) presented a new hybrid algorithm based on particle swarm optimization and simulated annealing for the job shop scheduling problem. Beasley (1990) developed the OR-Library, a collection of test data sets for a variety of Operations Research (OR) problems; the benchmark problems used here are taken from this OR-Library. Several researchers have addressed the importance of solving Job-shop Scheduling Problems, which helps industries solve real-world problems and schedule their jobs effectively.

Particle swarm optimization (PSO) was developed by Kennedy and Eberhart in 1995. The position of a particle corresponds to a solution of the problem being solved. Like a bird flying toward food, a particle moves its position toward a better solution according to the best particle's experience and its own experience. Every particle moves iteratively until the end of the iterations; we call this process the evolution process. At the end of the iterations, the position of the best particle is the best solution found for the problem. The original PSO was designed to search for solutions in a continuous space. Because PSO's local search ability is weaker than its global search ability, some local search scheme should be integrated with PSO to obtain better solutions. The proposed algorithm enhances the particles' searching ability and is suitable for solving the JSSP.

## Job-Shop Scheduling Problem

The Job-Shop Scheduling Problem (JSSP) is among the hardest combinatorial problems and has been the subject of a significant amount of literature in the Operations Research area. The JSSP consists of n jobs and m machines. Each job must go through m machines to complete its work, so we consider one job to consist of m operations. Each operation uses one of the m machines for a fixed time interval. Once an operation is started on a given machine, it cannot be interrupted before it finishes. The sequence of operations of each job is predefined and may differ between jobs. In general, one job being processed on one machine is considered one operation, denoted Oji (the jth job processed on the ith machine, 1 ≤ j ≤ n, 1 ≤ i ≤ m) (Garey et al. 1976 and Lawler et al. 1993). Each machine can process only one operation during a time interval. The objective of the JSSP is to find an operation permutation for all jobs that minimizes the makespan Cmax, i.e., the maximum completion time of the final operation in the schedule of n×m operations.

For an n×m JSSP, the problem can be modeled by a set of m machines, denoted by M = {1, 2, …, m}, processing a set of n×m operations, denoted by O = {0, 1, 2, …, (n×m)−1} (Lin et al. 2009). The notation is as follows:

- n : number of jobs
- m : number of operations for one job
- Oi : completion time of operation i (i = 0, 1, 2, …, (n×m)−1)
- ti : processing time of operation i on its given machine
- Cmax : makespan

The JSSP has n jobs to be processed on m machines, and we assume the following:

- No machine may process more than one job at a time;
- No job may be processed by more than one machine at a time;
- The sequence of machines that a job visits is completely specified and has a linear precedence structure;
- Processing times are known;
- Each job is processed on each machine exactly once.

## Particle Swarm Optimization

Swarm Intelligence (SI) is a distributed intelligence paradigm for solving optimization problems that originally took its inspiration from biological examples of swarming, flocking and herding phenomena in vertebrates. Particle Swarm Optimization (PSO) incorporates swarming behaviors observed in flocks of birds, schools of fish, swarms of bees, and even human social behavior, from which the idea emerged (Kennedy, 2001; Clerc, 2002).

Particle swarm optimization (PSO) is an evolutionary computation technique developed by Kennedy and Eberhart in 1995 (Eberhart R. C, 1995). The particle swarm optimization concept originated as a simulation of a simplified social system. The original intent was to graphically simulate the graceful but unpredictable choreography of a bird flock. Initial simulations were modified to incorporate nearest-neighbor velocity matching, eliminate ancillary variables, and incorporate multidimensional search and acceleration by distance (Eberhart R. C, 1995). At some point in the evolution of the algorithm, it was realized that the conceptual model was, in fact, an optimizer. Through a process of trial and error, a number of parameters extraneous to optimization were eliminated from the algorithm, resulting in the very simple original implementation (Eberhart R.C, 1996).

The original PSO formulae define each particle as a potential solution to a problem in D-dimensional space, with the position of particle i represented as

Xi = (xi1, xi2, …, xiD)

Each particle also maintains a memory of its previous best position, represented as

Pi = (pi1, pi2, …, piD)

A particle in a swarm is moving; hence it has a velocity, which can be represented as

Vi = (vi1, vi2, …, viD)

PSO is similar to a genetic algorithm (GA) in that the system is initialized with a population of random solutions. It is unlike a GA, however, in that each potential solution is also assigned a randomized velocity, and the potential solutions, called particles, are then "flown" through the problem space. Each particle keeps track of its coordinates in the problem space which are associated with the best solution (fitness) it has achieved so far. (The fitness value is also stored.) This value is called pbest. Another "best" value that is tracked by the global version of the particle swarm optimizer is the overall best value, and its location, obtained so far by any particle in the population. This location is called gbest.

The particle swarm optimization concept consists of, at each time step, changing the velocity of (accelerating) each particle toward its pbest and gbest locations (global version of PSO). Acceleration is weighted by a random term, with separate random numbers being generated for acceleration toward the pbest and gbest locations. There is also a local version of PSO in which, in addition to pbest, each particle keeps track of the best solution, called lbest, attained within a local topological neighborhood of particles. The (original) process for implementing the global version of PSO is as follows:

Initialize a population (array) of particles with random positions and velocities on d dimensions in the problem space.

For each particle, evaluate the desired optimization fitness function.

Compare particle's fitness evaluation with particle's pbest. If current value is better than pbest, then set pbest value equal to the current value and the pbest location equal to the current location in d-dimensional space.

Compare fitness evaluation with the population's overall previous best. If current value is better than gbest, then reset gbest to the current particle's array index and value.

Change the velocity and position of the particle according to equations (1) and (2), respectively:

vid = vid + c1 · rand() · (pid − xid) + c2 · Rand() · (pgd − xid) (1)

xid = xid + vid (2)

where rand() and Rand() are independent uniform random numbers in [0, 1], pgd is the gbest position, and each velocity component is clamped to the range [−Vmax, Vmax].

Loop to step 2 until a criterion is met, usually a sufficiently good fitness or a maximum number of iterations (generations).

The acceleration constants c1 and c2 in equation (1) represent the weighting of the stochastic acceleration terms that pull each particle toward pbest and gbest positions. Thus, adjustment of these constants changes the amount of "tension" in the system. Low values allow particles to roam far from target regions before being tugged back, while high values result in abrupt movement toward, or past, target regions.

Early experience with particle swarm optimization (trial and error, mostly) led us to set the acceleration constants c1 and c2 each equal to 2.0 for almost all applications. Vmax was thus the only parameter routinely adjusted, and it was often set at about 10-20% of the dynamic range of the variable on each dimension.
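To make the loop above concrete, here is a minimal sketch of the global version of PSO in Python. This is an illustration, not the authors' code: the sphere objective, swarm size, iteration count, and Vmax value are arbitrary choices for demonstration.

```python
import random

def pso(f, dim, n_particles=20, iters=200, c1=2.0, c2=2.0, vmax=0.2, seed=0):
    """Minimal global-best PSO minimizing f over [-1, 1]^dim (illustrative)."""
    rng = random.Random(seed)
    x = [[rng.uniform(-1.0, 1.0) for _ in range(dim)] for _ in range(n_particles)]
    v = [[0.0] * dim for _ in range(n_particles)]
    pbest = [xi[:] for xi in x]           # personal best positions
    pbest_val = [f(xi) for xi in x]       # personal best fitness values
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                # velocity update, eq. (1), clamped to [-vmax, vmax]
                v[i][d] += (c1 * rng.random() * (pbest[i][d] - x[i][d])
                            + c2 * rng.random() * (gbest[d] - x[i][d]))
                v[i][d] = max(-vmax, min(vmax, v[i][d]))
                x[i][d] += v[i][d]        # position update, eq. (2)
            val = f(x[i])
            if val < pbest_val[i]:        # update pbest
                pbest[i], pbest_val[i] = x[i][:], val
                if val < gbest_val:       # update gbest
                    gbest, gbest_val = x[i][:], val
    return gbest, gbest_val

# usage: minimize the sphere function; the global minimum is 0 at the origin
best, best_val = pso(lambda p: sum(t * t for t in p), dim=3)
```

Since pbest and gbest only ever improve, the returned fitness is monotonically non-increasing over the iterations.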

Based, among other things, on findings from social simulations, (Eberhart R.C, 1998) it was decided to design a "local" version of the particle swarm. In this version, particles have information only of their own and their neighbors' bests, rather than that of the entire group. Instead of moving toward a kind of stochastic average of pbest and gbest (the best location of the entire group), particles move toward points defined by pbest and "lbest," which is the index of the particle with the best evaluation in the particle's neighborhood. If the neighborhood size is defined as two, for instance, particle (i) compares its fitness value with particle (i-1) and particle (i+1).

## Simulated Annealing

Simulated Annealing locates a good approximation to the global minimum of a given function in a large search space. It is often used when the search space is discrete (e.g., all tours that visit a given set of cities). For certain problems, simulated annealing may be more effective than exhaustive enumeration provided that the goal is merely to find an acceptably good solution in a fixed amount of time, rather than the best possible solution.

Simulated Annealing (SA) is a random search technique that originates from the analogy between the physical annealing process and the search for a minimum in a general optimization process. It was first developed by Kirkpatrick et al. (1983), who adapted the work of Metropolis et al. (1953) to combinatorial optimization problems. The SA algorithm starts with a randomly generated set of initial solutions at a high starting temperature T. The algorithm replaces the present solution with a solution from its neighborhood if that solution is "better" than the current one. A "better" solution is one whose objective function value is lower (for a minimization problem) or higher (for a maximization problem) than the current value; sometimes, however, even a worse solution is accepted.

The latter solutions are accepted with a probability that is a function of the difference in objective function values and the temperature T. The temperature gradually decreases during the search, so solutions are replaced frequently at the beginning and replacements become rarer towards the end of the process. These steps are repeated until a termination criterion is reached. This heuristic has a good ability to avoid solutions being trapped in local minima because of the inclusion of the probability function, but its major shortcoming is the high computational cost of obtaining an exact solution (Wu and Wang, 1998). The literature includes papers on modifications of this algorithm to improve solution accuracy and convergence.

Figure.1. Simulated Annealing Flow chart

## Algorithm

```
Start with the system in a known configuration, at known energy E
T = temperature = hot; frozen = false
while (!frozen) {
    repeat {
        Perturb the system slightly (e.g., move a particle)
        Compute ΔE, the change in energy due to the perturbation
        if (ΔE < 0)
            then accept the perturbation; this is the new system configuration
            else accept it maybe, with probability = exp(-ΔE / kT)
    } until (the system is in thermal equilibrium at this T)
    if (E is still decreasing over the last few temperatures)
        then T = 0.9 * T   // cool the temperature; do more perturbations
        else frozen = true
}
return (final configuration as the low-energy solution)
```
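The annealing loop described above can be sketched in Python as follows. This is an illustration, not the authors' code: the quadratic objective, move size, cooling rate, and temperature bounds are arbitrary choices for demonstration.

```python
import math
import random

def simulated_annealing(f, x0, t_start=100.0, t_end=1e-3, cool=0.9,
                        moves_per_temp=50, step=0.5, seed=0):
    """Minimize f starting from x0 using the annealing loop described above."""
    rng = random.Random(seed)
    x, e = x0, f(x0)
    best_x, best_e = x, e
    t = t_start
    while t > t_end:                      # "frozen" once t is low enough
        for _ in range(moves_per_temp):   # perturbations at this temperature
            cand = x + rng.uniform(-step, step)
            delta = f(cand) - e
            # accept improvements; accept worse moves with prob exp(-delta/t)
            if delta < 0 or rng.random() < math.exp(-delta / t):
                x, e = cand, f(cand)
                if e < best_e:
                    best_x, best_e = x, e
        t *= cool                         # cool the temperature
    return best_x, best_e

# usage: minimize (x - 3)^2, whose minimum is 0 at x = 3
xb, eb = simulated_annealing(lambda x: (x - 3.0) ** 2, x0=-5.0)
```

At high temperatures almost any move is accepted, so the search roams freely; as t shrinks, only improving moves survive and the walk settles into a nearby minimum.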

## Problem Representation

Consider the following 3×3 Job-Shop Scheduling Problem. The operation times and machine ordering are shown in the table below (Tables 1-3 combined: each entry gives an operation with its machine and processing time).

| Job | 1st operation | 2nd operation | 3rd operation |
|---|---|---|---|
| J1 (O1, O2, O3) | O1 (M1, 2) | O2 (M3, 3) | O3 (M2, 4) |
| J2 (O4, O5, O6) | O4 (M2, 1) | O5 (M1, 5) | O6 (M3, 2) |
| J3 (O7, O8, O9) | O7 (M3, 4) | O8 (M1, 6) | O9 (M2, 4) |

Table 1. 3×3 JSSP; Table 2. Machine Allocation; Table 3. Operation Times

A feasible schedule for the above JSSP is 1, 4, 7, 8, 2, 5, 3, 9, 6. Performing the operations on the machines in this order gives a makespan of 17, which is a feasible result for the given JSSP. The Gantt chart of this schedule is given in Fig 2. In the proposed method we take such schedules as the particles to solve the JSSP.

| Machine | Operations (operation/processing time, over time slots 1-17) |
|---|---|
| M1 | O1/2 (slots 1-2), O8/6 (slots 5-10), O5/5 (slots 11-15) |
| M2 | O4/1 (slot 1), O3/4 (slots 8-11), O9/4 (slots 12-15) |
| M3 | O7/4 (slots 1-4), O2/3 (slots 5-7), O6/2 (slots 16-17) |

Figure.2. Gantt chart for Feasible Schedule

The search space for an n jobs on m machines Job-Shop Scheduling Problem has n×m dimensions. A problem solution is represented as a particle, whose position consists of n×m real numbers. In order to obtain an operation permutation sequence for the JSSP, the n×m real numbers are transformed into an integer series from 1 to n×m, where each integer represents one operation index (Oji, 1 ≤ j ≤ n, 1 ≤ i ≤ m) according to its rank among all n×m real numbers. Fig. 3 illustrates an example of the representation of a particle for a 3×3 JSSP.

| Particle | 30.4 | 2.5 | 41.2 | 0.6 | 5.3 | 7.5 | 6.4 | 3.4 | 8.2 |
|---|---|---|---|---|---|---|---|---|---|
| Integer Series (πk) | 8 | 2 | 9 | 1 | 4 | 6 | 5 | 3 | 7 |
| (πk mod 3) + 1 | 3 | 3 | 1 | 2 | 2 | 1 | 3 | 1 | 2 |
| Operation Sequence | O31 | O32 | O11 | O21 | O22 | O12 | O33 | O13 | O23 |

Figure.3. Problem Representation.
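A sketch of this decoding step (not the authors' code): rank the particle's real-valued positions in ascending order to obtain the integer series πk, then apply (πk mod n) + 1 to get the job owning each operation in the sequence. The function name below is illustrative.

```python
def decode_particle(position, n_jobs):
    """Map a particle's real-valued position to a job (operation) sequence."""
    # rank positions in ascending order: the k-th smallest value gets integer k+1
    order = sorted(range(len(position)), key=lambda i: position[i])
    pi = [0] * len(position)
    for rank, idx in enumerate(order):
        pi[idx] = rank + 1                  # integer series, 1 .. n*m
    # (pi_k mod n_jobs) + 1 gives the job index of each operation in sequence
    return [(k % n_jobs) + 1 for k in pi]

# the 3x3 example from Fig. 3
particle = [30.4, 2.5, 41.2, 0.6, 5.3, 7.5, 6.4, 3.4, 8.2]
jobs = decode_particle(particle, n_jobs=3)
# each occurrence of job j denotes j's next unprocessed operation, so the
# result corresponds to the sequence O31, O32, O11, O21, O22, O12, O33, O13, O23
```

Because ranking any vector of distinct reals yields a permutation, every particle position decodes to a feasible operation sequence.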

## Objective Function

The main objective of the JSSP is to find a schedule of operations that minimizes the maximum completion time (called the makespan), that is, the time by which all operations in the schedule for n jobs and m machines are completed. The objective (fitness) function takes as input the number of jobs, the number of operations, the particle, the operation-time sequence and the machine sequence of the corresponding operations. Each particle position is assigned an integer number (πk) by ranking the particle's positions (real numbers) in ascending order, and then (πk mod number of jobs) + 1 is applied to each πk to obtain the corresponding operation sequence of the particle. The fitness function outputs the makespan value for that operation sequence.
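The makespan of a decoded operation sequence can be computed with a semi-active schedule builder: each operation starts as soon as both its job's previous operation has finished and its machine is free. A sketch (not the authors' code), using the 3×3 instance from the Problem Representation section; all names and the data layout are illustrative:

```python
def makespan(job_seq, machines, times):
    """Semi-active makespan for a job sequence (jobs numbered from 1).

    machines[j][k] / times[j][k]: machine and processing time of the
    (k+1)-th operation of job j+1.
    """
    n_machines = max(max(row) for row in machines)
    job_ready = [0] * len(machines)          # finish time of each job's last op
    mach_free = [0] * n_machines             # time each machine becomes free
    next_op = [0] * len(machines)            # next operation index per job
    for j in job_seq:
        k = next_op[j - 1]
        m, t = machines[j - 1][k] - 1, times[j - 1][k]
        start = max(job_ready[j - 1], mach_free[m])
        job_ready[j - 1] = mach_free[m] = start + t
        next_op[j - 1] += 1
    return max(job_ready)

# 3x3 instance: J1 = O1(M1,2), O2(M3,3), O3(M2,4); J2 = O4(M2,1), O5(M1,5),
# O6(M3,2); J3 = O7(M3,4), O8(M1,6), O9(M2,4)
MACH = [[1, 3, 2], [2, 1, 3], [3, 1, 2]]
TIME = [[2, 3, 4], [1, 5, 2], [4, 6, 4]]

# the feasible schedule O1,O4,O7,O8,O2,O5,O3,O9,O6 expressed as a job sequence
print(makespan([1, 2, 3, 3, 1, 2, 1, 3, 2], MACH, TIME))  # -> 17
```

This matches the makespan of 17 shown in the Gantt chart of Figure 2.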

## Parameters taken

The following parameter values are used for the PSO and SA components when executing the proposed algorithm to achieve the optimal solution.

| Parameter | Value Taken |
|---|---|
| Swarm Size | 200 |
| C1 | 2.1 |
| C2 | 2.1 |
| Initial Temperature T | 95000 |

## Algorithm

The proposed algorithm for solving the JSSP is as follows:

1. Initialize the temperature T to a particular value.
2. Initialize N particles by generating n×m real numbers for each particle.
3. Find the operation-time sequence and machine sequence for the N particles.
4. Find the makespan value for every particle using the objective function, and find the minimum makespan value (best) among the N makespan values.
5. Compare each particle's fitness evaluation with its pbest. If the current value is better than pbest, then set the pbest value equal to the current value and the pbest particle equal to the current particle.
6. Compare the pbest values of the whole population and set the minimum pbest value as the gbest value.
7. Change the velocity and position of each particle according to equations (1) and (2), respectively.
8. If the gbest particle has not changed over a period of time:
   - Find a new particle using the temperature.
   - Accept the new particle as best with probability exp(−ΔE/T), even though the current position is worse; here ΔE is the difference between the current best particle's makespan and the new particle's makespan value.
   - Reduce T.
9. Loop to step 3 until a criterion is met, usually a sufficiently good gbest or a maximum number of iterations.
10. Terminate when the maximum number of iterations is reached or the optimal value is obtained.
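The steps above can be sketched as one compact hybrid loop. This is an illustrative reconstruction under our own simplifications, not the authors' implementation: the decoder, stagnation test, perturbation move, and all parameter values are assumptions.

```python
import math
import random

def decode(pos, n_jobs):
    """Rank real positions into an integer series, then map to job indices."""
    order = sorted(range(len(pos)), key=lambda i: pos[i])
    pi = [0] * len(pos)
    for rank, idx in enumerate(order):
        pi[idx] = rank + 1
    return [(k % n_jobs) + 1 for k in pi]

def makespan(job_seq, mach, time):
    """Semi-active makespan; mach/time give machine and time per job operation."""
    job_ready = [0] * len(mach)
    mach_free = [0] * max(max(r) for r in mach)
    nxt = [0] * len(mach)
    for j in job_seq:
        k = nxt[j - 1]
        m, t = mach[j - 1][k] - 1, time[j - 1][k]
        start = max(job_ready[j - 1], mach_free[m])
        job_ready[j - 1] = mach_free[m] = start + t
        nxt[j - 1] += 1
    return max(job_ready)

def pso_sa(mach, time, n_particles=30, iters=300, c1=2.1, c2=2.1,
           t0=95000.0, stall_limit=20, seed=0):
    """Hybrid PSO + SA: an SA-style kick perturbs gbest when it stagnates."""
    rng = random.Random(seed)
    n, dim = len(mach), len(mach) * len(mach[0])
    vmax = float(dim)
    fit = lambda p: makespan(decode(p, n), mach, time)
    x = [[rng.uniform(0, dim) for _ in range(dim)] for _ in range(n_particles)]
    v = [[0.0] * dim for _ in range(n_particles)]
    pb = [xi[:] for xi in x]
    pbv = [fit(xi) for xi in x]
    g = min(range(n_particles), key=lambda i: pbv[i])
    gb, gbv = pb[g][:], pbv[g]
    best_val, T, stall = gbv, t0, 0
    for _ in range(iters):
        improved = False
        for i in range(n_particles):
            for d in range(dim):
                v[i][d] += (c1 * rng.random() * (pb[i][d] - x[i][d])
                            + c2 * rng.random() * (gb[d] - x[i][d]))
                v[i][d] = max(-vmax, min(vmax, v[i][d]))   # clamp velocity
                x[i][d] += v[i][d]
            f = fit(x[i])
            if f < pbv[i]:
                pb[i], pbv[i] = x[i][:], f
                if f < gbv:
                    gb, gbv, improved = x[i][:], f, True
                    best_val = min(best_val, f)
        stall = 0 if improved else stall + 1
        if stall >= stall_limit:                 # gbest stagnated: SA step
            cand = [p + rng.uniform(-1.0, 1.0) for p in gb]
            cv = fit(cand)
            dE = cv - gbv
            # accept a (possibly worse) new gbest with probability exp(-dE/T)
            if dE < 0 or rng.random() < math.exp(-dE / T):
                gb, gbv = cand, cv
                best_val = min(best_val, cv)
            T *= 0.9                             # reduce the temperature
            stall = 0
    return best_val

# hypothetical usage on the 3x3 instance from the Problem Representation section
MACH = [[1, 3, 2], [2, 1, 3], [3, 1, 2]]
TIME = [[2, 3, 4], [1, 5, 2], [4, 6, 4]]
result = pso_sa(MACH, TIME)
```

Note the design choice: the working gbest may be replaced by a worse candidate to escape a local optimum, while the best makespan ever seen is tracked separately and returned.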

## RESULTS AND DISCUSSION

In this paper, we have used 21 instances taken from the OR-Library (Beasley, 1990) as benchmarks to test our proposed algorithm. The algorithm uses 200 particles as the initial population and an initial temperature T of 95000.

We used a system with an Intel Pentium Core 2 Duo 3 GHz processor and 2 GB RAM running Windows XP to execute the algorithm and achieved the following results. The CPU time is the time taken to find the optimal schedule for the benchmark JSSPs on this machine. The proposed method gives the optimal solution for most of the benchmark problems in a reasonable amount of CPU time.

## Table.5. Experimental Results

| S No | Instance Name | Size | Best Known Value | Simple PSO | Goncalves et al. (2005) | Simple PSO with SA | CPU Time (Simple PSO with SA) |
|---|---|---|---|---|---|---|---|
| 1 | FT06 | 6 × 6 | 55 | 55 | 55 | 55 | 45094 |
| 2 | LA01 | 10 × 5 | 666 | 669 | 666 | 666 | 10154246 |
| 3 | LA02 | 10 × 5 | 655 | 710 | 655 | 655 | 9180187 |
| 4 | LA03 | 10 × 5 | 597 | 656 | 597 | 597 | 9020672 |
| 5 | LA04 | 10 × 5 | 590 | 643 | 590 | 590 | 8956266 |
| 6 | LA05 | 10 × 5 | 593 | 593 | 593 | 593 | 3828 |
| 7 | LA06 | 15 × 5 | 926 | 930 | 926 | 926 | 1327718 |
| 8 | LA07 | 15 × 5 | 890 | 981 | 890 | 890 | 15407610 |
| 9 | LA08 | 15 × 5 | 863 | 919 | 863 | 863 | 15424641 |
| 10 | LA09 | 15 × 5 | 951 | 952 | 951 | 951 | 1823625 |
| 11 | LA10 | 15 × 5 | 958 | 958 | 958 | 958 | 48875 |
| 12 | LA11 | 20 × 5 | 1222 | 1259 | 1222 | 1222 | 7479891 |
| 13 | LA12 | 20 × 5 | 1039 | 1081 | 1039 | 1039 | 25368484 |
| 14 | LA13 | 20 × 5 | 1150 | 1196 | 1150 | 1150 | 24501969 |
| 15 | LA14 | 20 × 5 | 1292 | 1292 | 1292 | 1292 | 18734 |
| 16 | LA15 | 20 × 5 | 1207 | 1385 | 1207 | 1207 | 24505391 |
| 17 | LA16 | 10 × 10 | 945 | 1068 | 945 | 945 | 25145079 |
| 18 | LA17 | 10 × 10 | 784 | 892 | 784 | 784 | 23979688 |
| 19 | LA18 | 10 × 10 | 848 | 954 | 848 | 848 | 24118422 |
| 20 | LA19 | 10 × 10 | 842 | 1015 | 842 | 842 | 24984847 |
| 21 | LA20 | 10 × 10 | 902 | 1019 | 907 | 902 | 35645488 |

Our proposed algorithm generates the initial particles using random functions, and through the hybrid PSO and SA technique the particles are scattered over the solution space. The proposed algorithm gives greater diversity of particles, which leads to the solution in an acceptable amount of CPU time. The CPU times taken to achieve the optimum results are higher when there are more jobs and operations.

The simple PSO algorithm gives near-optimum solutions but stagnates within a certain range; diversification is the main difficulty for PSO in reaching the solution. Our proposed method gives the optimum results for the 21 instances. For the instance LA01, simple PSO gives 669 as its best solution, but simple PSO with SA gives the optimal solution of 666.

Goncalves et al. (2005) give the optimal solution for most of the benchmark problems, but our proposed algorithm gives the optimal solution within a reasonably small amount of time. We cannot compare the computation times of Goncalves et al. (2005) with those of our proposed work, because the system configurations are not the same.

Goncalves et al. (2005) report 907 as the makespan for the instance LA20, which is not an optimal solution, whereas our proposed algorithm gives the optimal solution (902). Our proposed algorithm also gives optimal solutions compared with the simple PSO algorithm for the 21 instances, as shown in Table 5.

## CONCLUSION AND FUTURE WORK

In this paper, the hybridization of Particle Swarm Optimization with Simulated Annealing is used to solve the NP-hard JSSP. The algorithm adopts a real-valued search space in which each particle represents a permutation of all operations of all jobs. Combining PSO with Simulated Annealing narrows the field of search and continually speeds up the rate of convergence during the optimizing process; it has higher search efficiency and can also escape from local minima.

In the proposed algorithm, the problem representation is simple and easy compared with other heuristic methods, and the algorithm reduces the computational complexity. The experiment was conducted with a population of 200 particles, and the optimal solution (best known solution) in the search space was obtained for the benchmark problems with the above-mentioned system configuration. In the proposed algorithm, it is assumed that each operation of a job is completed on a single machine.

Future work may include methods to solve the multi-machine environment and the dynamic Job-Shop Scheduling Problem. Machines with much higher configurations could be used to obtain the optimal solutions in a shorter time, and grid-based algorithms could be developed to find the makespan of the Job-Shop Scheduling Problem; this would reduce the computational complexity and increase the performance and quality of the scheduling.