Friday, April 2, 2010

Control and path prediction of an Automated Guided Vehicle.

PREPARED BY: VIJAY PATEL 09CAD12

1. Introduction.

Automated Guided Vehicles (AGVs) have been widely applied in flexible manufacturing systems. Many factories have adopted them in assembly and production lines, for example in the automobile, food-processing, and woodworking industries. Many researchers have developed and adapted AGV designs to suit applications addressing the principal problems of their factories. An AGV was first developed and researched by [17, 18, 19] in an attempt to use it in jumbo truck manufacturing in Thailand. In the course of this development we surveyed several papers concerned with design and control aspects, as follows. Different structures have been proposed in several cases.

[1] proposed an AGV architecture with two wheels driven through a differential gear drive with parallel-linkage steering; its design and operation were also presented by [2]. That paper determined the track layout and the number of AGVs for transportation control in job-shop and flow-shop settings using queuing-network theory. For an entire FMS, [3] proposed an operation control method using a two-AGV system. They solved the AGV scheduling problem with a model based on Petri nets; a formulation with heuristic global search was used to find the optimal operation of the whole FMS. The guide-path selection problem for AGVs in an FMS was addressed by [4], who proposed a material-flow modelling approach based on mathematical optimization and with it obtained a guide-path layout design for wire-guided vehicles. The objective of the optimization model is to minimize the total distance travelled by the vehicles in the material handling system.

For route planning of AGVs in an FMS, [5] presented a new approach to the dynamic route planning and scheduling problem, applying a search algorithm and heuristic rules to solve route assignment in dynamic situations. [6] proposed a path-planning strategy for AGV navigation, collision avoidance and docking to the target; the path planning was implemented on an on-board computer in order to avoid a wire-guided path. An AGV should not only move along its path with collision avoidance but also navigate without deadlock, as done by [7]. The control approach is an important part of commanding AGV actions: [8] formulated a control algorithm using a digraph method for real-time path assignment to the vehicles, and deadlock was handled with a coloured resource-oriented Petri-net model to keep real-time control conflict-free.

[9] applied variable-structure-system techniques: the AGV was modelled using kinematics and dynamics, and sliding-mode control with a Lyapunov design was applied to eliminate chattering, although the work was implemented only in simulation. Another paper proposed AGV control using fuzzy logic, as shown in [10]: the AGV was guided by a photoelectric guideway, and the controller self-adjusted its control parameters through a fuzzy controller. [11] proposed fuzzy steering control of an AGV guided by guide tape, showing the response and energy saving for a step change of the guide tape; the fuzzy controller achieved a greater reduction of steering energy than a PI controller. [12] presented a tracking algorithm for AGV navigation in a container terminal: a multiple-model algorithm based on multiple-sensor detection was used to detect obstacles and other AGVs, an unscented Kalman filter was used for AGV localization, and the proposed algorithm was verified by simulation. Adaptive control of an AGV was proposed by [13]: a nonlinear dynamic model was developed for motion generation, and the proposed control was based on the Lyapunov concept to ensure control of the AGV even with imperfect dynamic parameters.

Intelligence for AGVs has also been pursued with several methods, integrating sensors and vision for AGV control. [14] studied intelligent path following and control for a vision-based automated guided vehicle, presenting path-following control with a vision control system and multiple sensors applied in real-time steering control. The Hough transform algorithm was applied to detect the guideline of the path, as shown by [15]. The path guideline was recognized by optical sensors, as proposed by [16], using an array of 14 infrared (IR) emitter-detector pairs arranged in two columns; trajectory recognition was based on neural networks.







Figure 0. Mini-AGV, 60 x 15 cm.
2. System architecture.

2.1. AGV design.

It is a three-wheeled vehicle, as shown in Fig. 1. The front wheel is used for driving and steering the AGV, and the two rear wheels are free. Steering and driving are performed by DC motors. Two encoders are individually attached to the two rear wheels in order to measure the vehicle displacement and then calculate its real-time position and orientation. Positioning the encoders on the free wheels gives the vehicle an accurate measurement of its progression. A programmable logic controller (PLC) is used for motion control.
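The dead-reckoning calculation described above (two free-wheel encoders yielding real-time position and orientation) can be sketched with the standard differential-odometry update. The function below is illustrative: the wheel displacements and track width are assumptions, not values from the prototype's PLC program.

```python
import math

def odometry_update(x, y, theta, d_left, d_right, track_width):
    """Update the pose from incremental left/right rear-wheel displacements.

    d_left, d_right: distance travelled by each rear wheel since the
    last sample (encoder ticks * metres-per-tick).
    track_width: distance between the two rear wheels (metres, assumed).
    """
    d_center = (d_left + d_right) / 2.0          # forward displacement
    d_theta = (d_right - d_left) / track_width   # heading change
    # Midpoint integration of the unicycle model
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    theta += d_theta
    return x, y, theta

# Straight-line check: equal wheel displacements leave the heading at 0
x, y, th = odometry_update(0.0, 0.0, 0.0, 0.10, 0.10, 0.40)
```

Running this update every sampling period accumulates the vehicle's position and orientation between the wire-free waypoints.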

2.2. Control structure.
An AGV (automated guided vehicle) operates with three control structures, defined here:
1) PLC control structure.
2) Camera control structure.
3) Sensor control structure.

2.2.1. PLC control structure.
The motion parameters are the driving speed and steering angle, which determine the evolution of the position and orientation of the AGV. The input and output signals are interfaced with a PLC module. The inputs are the encoder signals from the left and right rear wheels. The driving speed and steering angle are calculated from these inputs, and the digital output is converted to an analog signal to drive the amplifiers of the driving motor and steering motor on the front wheel, as shown in Fig. 2.







A). INTEGRATED PLC CONTROL WITH WIRELESS CAMERA OF AGV SYSTEM.

i) SYSTEM ARCHITECTURE DESCRIPTION.

Figure 3.

The AGV prototype design is based on an existing JUMBO industrial truck, as shown in Figure 3. The vehicle structure (three wheels, with the front wheel driving and steering, encoders on the free rear wheels) and the PLC motion control are as described in Sections 2.1 and 2.2.1 and shown in Figures 1 and 2. Figure 4 shows the communication system. The Modbus protocol is selected for the communication structure between the PC at the operator site and the PLC on the AGV at the remote site. Master-slave parameters configure the communication protocol on the PC for remote operation through a standard RS-232 port. The master sends a command query to the slave, the slave answers back to the master, and communication takes place.
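The master-slave exchange described above can be illustrated with a minimal sketch of how a Modbus RTU query frame is built. The frame layout and CRC-16 follow the Modbus serial-line specification; the slave address and register values below are made-up examples, not the prototype's actual register map.

```python
def crc16_modbus(data: bytes) -> int:
    """CRC-16/MODBUS: reflected polynomial 0xA001, initial value 0xFFFF."""
    crc = 0xFFFF
    for byte in data:
        crc ^= byte
        for _ in range(8):
            if crc & 1:
                crc = (crc >> 1) ^ 0xA001
            else:
                crc >>= 1
    return crc

def read_holding_registers(slave: int, start: int, count: int) -> bytes:
    """Build a Modbus RTU 'Read Holding Registers' (function 0x03) query."""
    pdu = bytes([slave, 0x03,
                 start >> 8, start & 0xFF,
                 count >> 8, count & 0xFF])
    crc = crc16_modbus(pdu)
    return pdu + bytes([crc & 0xFF, crc >> 8])  # CRC low byte first

# Query slave 1 for two registers starting at address 0 (illustrative)
frame = read_holding_registers(1, 0, 2)
```

The master transmits such a frame over the RS-232/wireless link; the addressed slave validates the CRC and answers with a response frame of the same format.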


Figure 4. AGV wireless communication.







ii) CONTROL DESIGN SYSTEM.
With the deviation error evaluated, the steering and driving command signals can be calculated and converted to analog signals by the PLC. The steering and driving control strategies are shown in the simple block diagram of Figure 5. The correction applied to the command signal is proportional for the driving signal and proportional-derivative for the steering signal.





The control algorithm of the AGV has been implemented on a Schneider TSX Micro PLC. The program is written in PL7 Pro using Grafcet and structured text. The main inputs of the PLC are high-speed up/down counters connected to the two encoders. The steering and driving command outputs are converted to analog outputs in the range 0-5 V. The Grafcet loop executes three consecutive tasks, and the control loop is executed every 5 ms.

iii) INTEGRATED VISION SYSTEM.

Two modes of operation are developed with the AGV control function: automatic and manual. In this section we describe manual control with a wireless camera mounted at a fixed point at the front of the AGV, in a look-ahead visual control structure, as illustrated in Figure 6 a). An AV-receiver module sends and receives the audio/video signal transmitted in the radio-frequency range over a maximum distance of 100 meters, as illustrated in Figure 6 b). A human operator can view the environment and control the AGV during its movement in manual mode.



iv) SIMULATION AND EXPERIMENTS.

In the control design, DC motors are used for the driving system. The AGV control system has two axes: one for the driving axis and the other for the steering axis. Positioning control of the AGV is needed to control displacement and steering angle according to the generated path command. In this work we design the control structure in Matlab using a PD controller, as depicted in Figure 7. The control gains are obtained as Kp = 27.5 and Kd = 5.5, giving a response time of 1.8 seconds with no steady-state error, as shown by the control performance in Figure 8.
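A discrete PD controller of the kind described above can be sketched as follows. The gains Kp = 27.5 and Kd = 5.5 come from the text, and the 5 ms sample period matches the PLC control loop; the controller structure itself is a plausible textbook form, not the exact Matlab model.

```python
class PDController:
    """Discrete PD controller: u = Kp*e + Kd*(e - e_prev)/dt (sketch)."""

    def __init__(self, kp, kd, dt):
        self.kp, self.kd, self.dt = kp, kd, dt
        self.e_prev = 0.0

    def update(self, setpoint, measured):
        e = setpoint - measured
        # Backward-difference approximation of the error derivative
        u = self.kp * e + self.kd * (e - self.e_prev) / self.dt
        self.e_prev = e
        return u

# Gains from the text; 5 ms loop period as in the PLC implementation
pd = PDController(kp=27.5, kd=5.5, dt=0.005)
u0 = pd.update(1.0, 0.0)  # unit step: e = 1, derivative kick on first sample
```

One such controller per axis (driving and steering) matches the separated two-axis structure described in the text.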




v) Experiment Results.

The command window illustrated in Figure 9 is used to control AGV movement through a PC over a long distance. There are two types of command: a positioning command (x, y position) and a jog-mode command, which tests motion control on each axis (go, turn left, turn right, etc.). In a positioning command, a set of (x, y) coordinate positions is sent to the AGV along the designed path of movement. Experiments were conducted in several tests; for example, Figure 10 shows an S-curve path for the AGV designed on the PC. The position pairs are sent to the AGV over the wireless communication channel. The AGV receives the commands for the two axes, steering and driving, to control the AGV along the specified path. The result of the AGV moving along the specified path is shown in Figure 11.





2.2.2. Camera Control structure.

In this article we discuss a machine vision system that uses one on-board camera to guide the AGV to pick up a pallet, and we explain some of the challenges that need to be taken into account in such a project.


Figure 12. Camera for the AGV.

Figure 13.


• Closed-loop control.
In order to guide the AGV to its target, a closed-loop control system is needed, as seen in Figure 13. The control system consists of a sensor, a controller, and an actuator, which changes the position of the machine. In this case the sensor contains a machine vision application that detects the position of the pallet and returns that position in real-world units to the controller. The controller calculates the steering commands for the actuator, which mechanically moves the AGV closer to the pallet and finally picks it up. This article concentrates on the sensor and the technical challenges related to its implementation.

• The sensor.
The sensor's task is to detect the pallet in the environment without any artificial cues. The sensor consists of a camera looking forward from the AGV and a computer that processes the images from the camera. The processing consists of a pattern recognition algorithm. There are at least two ways to do the pattern recognition: grey-value-based correlation and edge-based matching. Edge-based matching is used in this application since it is more robust against illumination variations.

• Edge-based matching.
In order to use edge-based matching, the shape to be matched has to be well known. In this case the shape is one side of the pallet, as seen in Figure 14. Furthermore, the assumed size of the pallet in the image needs to be known, or otherwise the search will consume too much time and the AGV cannot be controlled in real time. This approach assumes the visible shape has sharp edges, which is typically the case with pallets.
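A minimal sketch of the idea: extract a binary edge map (here with a Sobel gradient, an assumed choice — the text does not specify the detector) and score how well a known edge template of the pallet side matches at a given image position.

```python
def sobel_edges(img, thresh):
    """Binary edge map from the Sobel gradient magnitude (pure-Python sketch).

    img: 2-D list of grey values. Returns a same-size map with 1 where the
    gradient magnitude exceeds `thresh` (border pixels are left at 0).
    """
    h, w = len(img), len(img[0])
    edges = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = (img[y-1][x+1] + 2*img[y][x+1] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y][x-1] - img[y+1][x-1])
            gy = (img[y+1][x-1] + 2*img[y+1][x] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y-1][x] - img[y-1][x+1])
            if (gx * gx + gy * gy) ** 0.5 > thresh:
                edges[y][x] = 1
    return edges

def match_score(edges, template, ox, oy):
    """Fraction of template edge pixels that coincide with image edges
    when the template is placed at offset (ox, oy)."""
    hits = total = 0
    for ty, row in enumerate(template):
        for tx, v in enumerate(row):
            if v:
                total += 1
                hits += edges[oy + ty][ox + tx]
    return hits / total if total else 0.0
```

In a real matcher the template is searched over positions (and usually scales), and the best-scoring placement gives the pallet position; knowing the expected pallet size in the image, as noted above, keeps that search tractable.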


Figure 14. Shape of a pallet found with edge-based matching.

• Edge-based matching problems.
It is hard to find a shape in an image that contains many false edges, for example shadow edges created by sunlight. In addition, some pallets have rounded edges, which cannot easily be found in the image. False edges are a major problem in this kind of application, since the pallets are in a naturally illuminated environment and the environment itself contains many other edges. One way to fight this problem is to use additional information to help the pattern recognition algorithm; in this application, depth information is used.

• Shape separation with motion vectors.
Motion-vector-based shape separation is based on an optical flow algorithm, which calculates motion vectors from consecutive video frames. The motion vectors illustrate the movement in the image, i.e. where each part of the image has moved between consecutive frames (see Figure 14). This method can be applied to separate objects that are at different distances from the camera. A requirement is that the objects stay still while the camera moves.

Figure 14.
In order to calculate the motion vectors precisely, all the surfaces should have texture. In addition, reflections that appear to move with the camera should be minimized. The calculation itself is resource-intensive, so a fast computing platform is needed to use this method in real time.

• Image processing.

i) Vision for Automatic Guided Vehicles.
Automatic guided vehicles (AGVs) are used to perform routine tasks for industry, as well as being used in areas hazardous to humans. Machine vision can provide such vehicles with 'sight', allowing them to understand their surroundings and leading to more flexible use of AGVs. Stereo vision enables AGVs to have a three-dimensional (3D) understanding of their environment (Fig. 14.1). This can be used to perform free-space mapping (FSM), which allows an AGV to find clear paths between obstacles.






ii) Detecting Obstructions












Figure 15. Example of how to apply image processing to detect obstructions

First, the ground plane (GP) is identified by fitting a plane through objects lying in it, e.g. floor markings. Edge detection identifies features in the AGV's field of vision (Fig. 14.2). Edges that are not in the GP are assumed to belong to objects that extend to the GP, and would thus obstruct the AGV's movements (Fig. 14.3). This approach enables 3D objects to be distinguished from features such as floor markings. Three-dimensional scene edges, derived using stereo vision, can also be used for vehicle navigation and for the location and tracking of known objects (Fig. 14.4).
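The ground-plane step can be sketched as follows, assuming 3D floor points are already available from stereo. The least-squares plane fit and the height threshold are illustrative choices, not details from the text.

```python
def fit_ground_plane(points):
    """Least-squares fit of z = a*x + b*y + c through 3-D floor points."""
    # Accumulate the 3x3 normal equations (A^T A) p = A^T z
    sxx = sxy = syy = sx = sy = n = 0.0
    sxz = syz = sz = 0.0
    for x, y, z in points:
        sxx += x * x; sxy += x * y; syy += y * y
        sx += x; sy += y; n += 1
        sxz += x * z; syz += y * z; sz += z
    m = [[sxx, sxy, sx, sxz],
         [sxy, syy, sy, syz],
         [sx,  sy,  n,  sz]]
    # Gaussian elimination with partial pivoting
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, 3):
            f = m[r][col] / m[col][col]
            for c in range(col, 4):
                m[r][c] -= f * m[col][c]
    p = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):  # back substitution
        p[r] = (m[r][3] - sum(m[r][c] * p[c] for c in range(r + 1, 3))) / m[r][r]
    return p  # (a, b, c)

def is_obstacle(point, plane, tol=0.05):
    """True if the point lies more than `tol` above the fitted ground plane."""
    a, b, c = plane
    x, y, z = point
    return z - (a * x + b * y + c) > tol
```

Edge points classified above the plane are treated as obstructions; points on the plane are floor markings the AGV may drive over.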








iii) Exploring New Surroundings.













Figure 15. Superimposing a Cartesian grid for the Automatic Guided Vehicle.


When an autonomous guided vehicle (AGV) enters an unknown area, it must be able to understand its surroundings as it proceeds. One approach to achieving this involves analysing the images received from a camera placed at the front of the AGV. First, features such as corners (T-junctions or Y-junctions) are located as the AGV moves through the scene. Such features are chosen because the point of intersection of the lines forming a corner is fixed regardless of the viewing angle. The identified features are then tracked through the series of images. As the AGV moves, the apparent motion of features in its field of view depends on their distance; in fact, from the trajectories of features in the image, their 3D positions and the 3D motion of the AGV can be estimated. By superimposing a Cartesian grid on the image (see Fig. 15), a drivable region can be defined.

iv) AGV Surveillance.





Figure 16. View of the monitor in AGV surveillance.

AGVs have mechanisms to locate their position relative to their environment. Correct calculation of the AGV's position is essential for safe and effective performance of tasks. It is therefore necessary to monitor the movements of an AGV and correlate this information with the AGV's own estimate of its position. One such system uses four fixed cameras to survey a workspace. The image from each camera is used to identify objects by subtracting the received image from a reference image of the empty workspace. The positions of the cameras are calibrated so that the positions of objects on the floor can be determined from their positions in the image. The data from the four cameras are fused to achieve precise location. Objects such as people can be distinguished from an AGV by using models that describe their characteristic features.
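The reference-image subtraction can be sketched as a simple per-pixel difference threshold; the threshold value and grey-level image representation are illustrative assumptions.

```python
def detect_objects(image, reference, thresh):
    """Flag pixels that differ from the empty-workspace reference image.

    image, reference: same-size 2-D lists of grey values.
    Returns a binary mask: 1 where the absolute difference exceeds `thresh`.
    """
    h, w = len(image), len(image[0])
    return [[1 if abs(image[y][x] - reference[y][x]) > thresh else 0
             for x in range(w)] for y in range(h)]
```

Connected regions of flagged pixels correspond to objects in the workspace; their calibrated floor positions from the four cameras are then fused into one location estimate.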

2.2.3 Sensor control structure.
Figure 17.

• A key requirement for autonomous navigation in an unconstrained and uncertain environment is that the system be capable of sensing the surroundings to determine where the AGV currently is (localization) and where it is moving, so that the AGV can respond intelligently to a changing situation or environment. As with many autonomous navigation systems, all the sensors are placed on board the vehicle. Sensory modules on board include optical encoders, rate sensors, accelerometers, a gyroscope, a compass, DGPS, a CCD colour camera, a laser scanner, sonar and proximity sensors. The sensors provide complementary information about the internal state of the vehicle and the current state of the environment; therefore, a proper sensor fusion algorithm can be developed to obtain localization and obstacle information. The vision system is used for localization and local navigation. For outdoor navigation, two novel lane detection algorithms have been developed. In one approach, the lane edges are detected directly in the camera image and then converted into real-world coordinates (see Figure 18); a Catmull-Rom spline-based lane model and a free-form snake-based (FFSB) algorithm [7], which describes the perspective effect of parallel lines, have been developed. In the other approach, the lanes are assumed to lie on flat ground; the camera image is first projected onto the ground plane, and the edges are then detected in the ground image. In this algorithm, deformable templates are used, and a circular arc describes the lane shape within a limited range. Differential Global Positioning Systems (DGPS) are gaining widespread popularity as a navigational aid, especially in providing absolute position information. We developed a position estimation system that combines information from different sensors, viz. DGPS, a rate gyroscope and odometers, via an extended Kalman filtering technique.
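The fusion idea can be illustrated with a scalar Kalman filter that predicts with odometry and corrects with DGPS fixes. The noise variances and measurement sequence below are invented for illustration; the real system uses a full extended Kalman filter over the nonlinear vehicle model with gyroscope input as well.

```python
def kalman_fuse(x, p, u, z, q, r):
    """One predict/update cycle of a scalar Kalman filter.

    x, p : position estimate and its variance
    u    : odometry displacement since the last cycle (prediction input)
    z    : DGPS position measurement
    q, r : process (odometry) and measurement (DGPS) noise variances
    """
    # Predict with odometry
    x_pred = x + u
    p_pred = p + q
    # Correct with the DGPS fix
    k = p_pred / (p_pred + r)          # Kalman gain
    x_new = x_pred + k * (z - x_pred)
    p_new = (1 - k) * p_pred
    return x_new, p_new

# Noisy DGPS readings around a vehicle advancing 1 m per cycle (illustrative)
x, p = 0.0, 1.0
for z in [1.2, 2.1, 2.9, 4.2, 5.0]:
    x, p = kalman_fuse(x, p, 1.0, z, q=0.01, r=0.25)
```

The estimate converges between the drifting odometry and the noisy absolute fixes, and its variance shrinks as measurements accumulate, which is the behaviour the fused localization relies on.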


Figure 18.

Navigation module.
• The navigation module is implemented using a behavioural approach: each complex local navigational task is analysed in terms of primitive behaviours and expressed as an aggregation of such behaviours. A fuzzy logic approach to behaviour synthesis and integration has been adopted; the fuzzy behavioural methodology provides a natural means of incorporating human navigation skills in terms of linguistic information. The fuzzy behaviours considered necessary include wall/curb following, obstacle avoidance, obstacle contouring, narrow-path manoeuvring, cornering, route following and wandering. Each behaviour is synthesized from appropriately fused sensory data received from the complementary sensor devices. The travelling profile of the AGV performing left-curb following and a right turn at a junction is shown in Figure 19.

Figure 19.
• A novel behaviour fusion method for the navigation of AGVs in unknown environments was also investigated. The proposed navigator consists of an Obstacle Avoider (OA), a Goal Seeker (GS) and a Navigation Supervisor (NS). The fuzzy actions inferred by the OA and the GS are weighted by the NS using local and global environmental information and fused through fuzzy set operations to produce a command action, from which the final crisp action is determined by defuzzification. Simulation shows that the navigator is able to perform successful navigation tasks in various unknown environments, with smooth action and exceptionally good robustness to sensor noise.
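The OA/GS weighting and defuzzification can be sketched as follows. The membership function, the crisp behaviour actions, and the weighted-average defuzzification are simplifying assumptions for illustration, not the navigator's actual rule base.

```python
def triangular(x, a, b, c):
    """Triangular membership function peaking at b (zero outside [a, c])."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuse_behaviors(obstacle_dist, goal_bearing):
    """Weight the Obstacle Avoider and Goal Seeker actions and defuzzify
    to a single steering command (degrees, positive = left). Illustrative."""
    # NS weighting: the OA dominates when an obstacle is near (within ~2 m)
    w_oa = triangular(obstacle_dist, -1.0, 0.0, 2.0)   # near -> weight ~1
    w_gs = 1.0 - w_oa
    # Simplified crisp behaviour proposals
    oa_action = 30.0          # swerve left, away from the obstacle (assumed)
    gs_action = goal_bearing  # steer toward the goal
    # Weighted-average defuzzification of the fused action
    return (w_oa * oa_action + w_gs * gs_action) / (w_oa + w_gs)
```

With no obstacle nearby the command follows the goal bearing; as an obstacle closes in, the supervisor shifts the weight smoothly toward the avoidance action, which is what gives the navigator its smooth, non-switching behaviour.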


CONCLUSION.

1) PLC control structure.
• This article presented control of an AGV with an integrated wireless camera. The overall structure for designing the AGV was described. AGV motion control is implemented using a PD control scheme, with the driving and steering axes handled separately. In this research a wireless camera was fitted for controlling the AGV in a remote area; in manual mode, position and orientation movements are estimated and controlled by the operator's eyes, while the motion control system measures the position and orientation of the AGV. Simulation in Matlab/Simulink was used to verify the designed control parameters. The experiment was tested, for example, with a specified S-curve movement of the AGV. We conclude that the vehicle can travel from the starting position to the target position with accurate location, and that the AGV can be controlled with a quickly scanned environment by using the wireless camera for inspection and freedom of operation. Future work is planned to increase autonomy by adding observation techniques through an image processing unit; treatment of the dynamic model of the vehicle is also planned as a next step.

2) Camera control structure.
• To make this kind of application more robust, several image modalities can be used. In this case a grey-scale image and a depth image were used. These modalities assist each other, since pattern recognition fails in different situations for the grey-scale image than for the depth image. Pattern recognition algorithms are not a silver bullet for all machine vision recognition cases: deep image-processing knowledge is needed to know when and how these algorithms can be used and under what boundary conditions. When the application is not trivial, as with the one discussed here, it is better to turn to a machine vision expert.
• Implementing a camera-based Local Positioning System (LPS) has established its potential as an essential remote supervisor of multiple semi-autonomous AGVs, scaled down within the concept of a large indoor MiniWorld laboratory. With minimal changes to the AgileFrames concepts of individual and fleet control, extending the supervisory and localization task to the real world would assume our LPS is substituted by the general free-field modalities offered by GPS, the satellite-based Global Positioning System. In the real world, notably in the public domain with a mix of non-system-related actors, the semi-autonomous vehicles could, in addition to GPS, incorporate obstacle detection and ranging.



3) Sensor control structure.
• Many factors need to be taken into account in the development of an autonomous navigation system for outdoor AGVs. Among them, the design of the vehicle structure, the computer architecture, the actuation system and its control, and the use of various sensors play an important role. Further, it is important that the autonomous navigation system be intelligent enough to act both reactively and proactively to changing environmental conditions. Realizing such an architecture involves developing subsystems for sensing, path planning, localization, local navigation and path control. A judicious choice of good sensors for determining the internal state of the AGV and the state of the environment is of paramount importance. Especially in an outdoor environment, due to factors such as weather, changing environmental conditions and road conditions, a few sensors acting independently may not be adequate; an array of sensors must be used with appropriate sensor fusion technology. Advanced non-linear control methods are a necessity in controller development, as AGV kinematic and dynamic models are highly non-linear and complex. A local navigation system based on a behaviourist decomposition of tasks is highly suited to the complexity of local navigation in the presence of environmental uncertainty and modelling difficulties.


• Other modules through which the AGV's speed and position can be controlled are, for example, given below.

VEHICLE AND CONTROLLER MODULE.
Automatic control is an important function of the AGV. The main function of the control system is to track a desired path specified by the local navigation system as accurately as possible. The control problem is compounded by the fact that the model dynamics are highly non-linear and of high order. Further, model parameters are either unknown or uncertain, and are also subject to variation due to changing load and environmental conditions. Detailed and accurate kinematic and dynamic models of the non-holonomic AGV were developed for designing and testing suitable controllers and navigation strategies. Despite the model complexity of the AGV, linear PD/PID control methods have been applied, yielding adequate performance at slow speeds. To enhance performance, several non-linear methods are being investigated, including input-output feedback linearization and sliding mode control. Considering the complexity of the vehicle dynamics, the high non-linearity, the difficulty of obtaining actual vehicle dynamic parameter values, the variability of certain model parameters, and the human knowledge available on speed and steering control, fuzzy control methods are also being investigated. The experimental results of such a fuzzy control scheme, which consists of a Lateral Fuzzy Controller (LAFC) and a Fuzzy Drive Controller (FDC), are shown in Figure 20.


Figure 20.


REFERENCES.

1) PLC control structure.
1. Butdee, S., and Suebsomran, A. (2007). Localization Based on Matching Location of AGV, Proceeding of the 24th
International Manufacturing Conference, IMC24, Waterford Institute of Technology, Ireland, pp. 1121-1128.
2. Butdee, S., and Suebsomran, A. (2006). Learning and Recognition Algorithm of Intelligent AGV System,
GCMM2006, Santos, Brazil, pp.13-72.
3. Butdee, S., Vignat F., and Suebsomran, A. (2006). Self - alignment Control of an Automated Unguided Vehicle,
IDMME06, Grenoble, France.
4. Tomita, K., and Tsugawa, S. (1994). Visual Navigation of a Vehicle Along Roads: The Algorithm and
Experiments, IEEE 1994 Vehicle & Navigation Information Systems Conference Proceedings, pp. 419-424.
5. Seelinger, M, and Yoder, J-D. (2005). Automatic Pallet Engagement by a Vision Guided Forklift, Proceedings of
the 2005 IEEE International Conference on Robotics and Automation , Barcelona, Spain, pp. 4068-4073.
6. Kay G. M., and Lud R. C., (1993). Global Vision for the Control of Free-Ranging AGV Systems, Proceedings of
the 1993 IEEE International Conference on Robotics and Automation, Atlanta, Georgia, Volume 2 , pp. 14-19.
7. Fukuda T., Yokoyama Y., Abe Y., and Tanaka K. (1996). Navigation System Based on Ceiling Landmark
Recognition for Autonomous Mobile Robot- Position / Orientation Control by Landmark Recognition with Plus
and Minus Primitives-, Proceedings of the 1996 IEEE International Conference on Robotics and Automation ,
Minneapolis, Minnesota, pp. 1720-1725.
8. Hayakawa Y., White R., Kimura T., and Naito G. (2004). Driver-Compatible Steering System for Wide Speed-
Range Path Following, IEEE/ASME Transactions on Mechatronics, 9,: 544-552.

2) Camera control structure.
1. Evers, J.J.M., L. Loeve, D.G. Lindeijer (2000). The service-oriented agile logistic control
and engineering system: SERVICES. Logistic Information management, Vol.13, No.2.
2. Furnée, E.H. (1967). Hybrid instrumentation in prosthetics research. In Proc. 7th Int.Conf.
on Medical and Biological Engineering, Stockholm 1967, p.446.
3. Furnée, E.H., A. Jobbágy, J.C. Sabel, H.L.J. van Veenendaal, F. Martin, D.G.W.C. Andriessen (1997). Marker-referred movement measurement with grey-scale coordinate extraction for high-resolution real-time 3-D at 100 Hz. In (J. Walton, ed.) Proc. of SPIE, Volume 3173, pp. 357-369.
4. Lindeijer, D.G. (2000). The new logistics is Integrated, Communicative and Real-time. In (P.H.L. Bovy, ed.) Proc. TRAIL 6th Annual Congress "Transport, Infrastructure and Logistics", The Hague, Dec. 12, 2000, pp. 87-108.
5. Van Baarlen, J.F. (2001). The motion capture system PRIMASNT: Hardware, software and applications. MSc Thesis, Faculty of Applied Physics (section Signals, Systems and Control), Delft University of Technology, April 2001, 124 p.

3) Sensor control structure.
1. C. C. Chan, “An Overview of Electric Vehicle
Technology”, Procs of the IEEE, Vol.81, No.9, Sept. 1993,
pp 1202-1213.
2. Y. K. Tham, Sensor Fusion for Four-Wheel Steerable
Industrial Vehicles, MEng Thesis, School of EEE, NTU,
1999.
3. C. T. Goh, Position Estimation of Autonomous Guided
Vehicle, MEng Thesis, School of EEE, NTU, 1999.
4. Y. Wang, A Novel Vision-Based Lane Detection and
Tracking Algorithm using B-Snake, MEng Thesis, School
of EEE, NTU, 1999.
5. Y. Wang, D. G. Shen and E. K. Teoh, “Lane Detection
using Spline Model”, to appear in Pattern Recognition
Letter, 2000, Netherlands.
6. L. Y. Chen, Z. K. Lu and E. K. Teoh, “A novel Affine
Invariant Feature set and its Applications in Motion
Estimation”, Procs ICIP-2000, Sept. 10-13, 2000, Canada.
7. Z. K. Lu and E. K. Teoh, “A Free form Snake-Based
Algorithm for Lane Detection”, Procs MVA -2000, Nov.
28-30, 2000, Japan.
8. Z. Xue, D. G. Shen and E. K. Teoh, “An Efficient Fuzzy
Algorithm for Aligning Shapes under Affine
Transformation”, to appear in Pattern Recognition, 2000.
