US20190027034A1 - Variable steering error limits for automated vehicle control - Google Patents

Variable steering error limits for automated vehicle control

Info

Publication number
US20190027034A1
US20190027034A1 US15/653,879 US201715653879A
Authority
US
United States
Prior art keywords
lane
vehicle
set forth
host vehicle
signal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/653,879
Inventor
Wenda Xu
Junqing Wei
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Motional AD LLC
Original Assignee
Aptiv Technologies Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Aptiv Technologies Ltd filed Critical Aptiv Technologies Ltd
Priority to US15/653,879 priority Critical patent/US20190027034A1/en
Assigned to DELPHI TECHNOLOGIES, INC. reassignment DELPHI TECHNOLOGIES, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: WEI, JUNQING, XU, Wenda
Priority to PCT/US2018/042465 priority patent/WO2019018378A1/en
Assigned to APTIV TECHNOLOGIES LIMITED reassignment APTIV TECHNOLOGIES LIMITED ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: DELPHI TECHNOLOGIES INC.
Publication of US20190027034A1 publication Critical patent/US20190027034A1/en
Assigned to MOTIONAL AD LLC reassignment MOTIONAL AD LLC ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: APTIV TECHNOLOGIES LIMITED
Abandoned legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/09Arrangements for giving variable traffic instructions
    • G08G1/0962Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0967Systems involving transmission of highway information, e.g. weather, speed limits
    • G08G1/096708Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control
    • G08G1/096725Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control where the received information generates an automatic action on the vehicle control
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B62LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62DMOTOR VEHICLES; TRAILERS
    • B62D15/00Steering not otherwise provided for
    • B62D15/02Steering position indicators ; Steering position determination; Steering aids
    • B62D15/025Active steering aids, e.g. helping the driver by actively influencing the steering system after environment evaluation
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B62LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62DMOTOR VEHICLES; TRAILERS
    • B62D15/00Steering not otherwise provided for
    • B62D15/02Steering position indicators ; Steering position determination; Steering aids
    • B62D15/025Active steering aids, e.g. helping the driver by actively influencing the steering system after environment evaluation
    • B62D15/0265Automatic obstacle avoidance by steering
    • G06K9/00805
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • G08G1/161Decentralised systems, e.g. inter-vehicle communication
    • G08G1/163Decentralised systems, e.g. inter-vehicle communication involving continuous checking
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • G08G1/165Anti-collision systems for passive traffic, e.g. including static obstacles, trees
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • G08G1/166Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • G08G1/167Driving aids for lane monitoring, lane changing, e.g. blind spot detection
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • G08G1/168Driving aids for parking, e.g. acoustic or visual feedback on parking space
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/3453Special cost functions, i.e. other than distance or default speed limit of road segments
    • G01C21/3461Preferred or disfavoured areas, e.g. dangerous zones, toll or emission zones, intersections, manoeuvre types, segments such as motorways, toll roads, ferries

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Atmospheric Sciences (AREA)
  • Traffic Control Systems (AREA)

Abstract

A system for semi-autonomous, or autonomous, operation of a host vehicle includes an object detector and a controller. The object detector is configured to detect an object proximate to a lane boundary and output an object signal. The controller is configured to process the object signal and direct the host vehicle away from the lane boundary upon detection of the object.

Description

    BACKGROUND
  • The operation of modern vehicles is becoming increasingly autonomous, with a corresponding decrease in driver intervention. A control feature of such modern vehicles may cause the vehicle to drive in the center of a lane regardless of the existence of objects to the left or right of the lane. One example of such an object is a vehicle parked at the side of the road, just outside of the lane. If the vehicle remains at the center of the lane, it may pass the object closely.
  • SUMMARY
  • In one, non-limiting, exemplary embodiment of the present disclosure, a system for semi-autonomous or autonomous operation of a host vehicle includes an object detector and a controller. The object detector is configured to detect an object proximate to a lane boundary and output an object signal. The controller is configured to process the object signal and direct the host vehicle away from the lane boundary upon detection of the object.
  • In another, non-limiting, embodiment, an autonomous vehicle includes a controller and a steering unit. The controller includes a processor and an electronic storage medium. The steering unit is constructed and arranged to receive a steering command from the controller for moving the autonomous vehicle from a centered position to a biased position upon receipt of an object detected signal.
  • In another, non-limiting, embodiment, a computer software product is executed by a controller of an automated vehicle configured to receive object and lane positioning signals to control lane positioning of the automated vehicle based on the detection of an object proximate to a lane boundary. The computer software product includes an object module and a lane positioning module. The object module is configured to receive the object signal and determine if the object is proximate to the lane boundary. The lane positioning module is configured to receive the lane positioning signal to determine lane positioning of the automated vehicle, and at least in-part effectuate movement of the automated vehicle from a centered position to a biased position that is away from the lane boundary if the object module determines that the object is proximate to the lane boundary.
  • These and other advantages and features will become more apparent from the following description taken in conjunction with the drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The subject matter which is regarded as the invention is particularly pointed out and distinctly claimed in the claims at the conclusion of the specification. The foregoing and other features, and advantages of the invention are apparent from the following detailed description taken in conjunction with the accompanying drawings in which:
  • FIG. 1 is a top view of a multi-lane roadway traveled by an automated vehicle equipped with a system to detect an object proximate to a lane boundary in accordance with the present invention; and
  • FIG. 2 is a schematic of the automated vehicle with the system in accordance with the invention.
  • DETAILED DESCRIPTION
  • FIG. 1 illustrates a non-limiting example of a variable steering error limit system, hereafter the system 20, for semi-autonomous, or autonomous operation of a host vehicle 22. The host vehicle 22 may be a fully automated vehicle (i.e., autonomous vehicle). As part of a fully automated vehicle, the system 20 may control the speed, direction (e.g., steering), brakes, and other aspects of the host vehicle operation necessary for the host vehicle 22 to travel in a lane 24 of a roadway 26 without interaction from an occupant, or operator 28 (see FIG. 2) situated within the host vehicle 22. The roadway 26 may have multiple lanes 24 with the lanes being defined between two lane boundaries 30, 32. The lane boundaries 30, 32 may be represented by painted markings on the roadway 26, by curbs, or by a combination of both.
  • Referring to FIGS. 1 and 2, and in another application, the host vehicle 22 may be driven by the operator 28. In this case, the system 20 may provide assistance to the operator 28 (i.e., a semi-autonomous vehicle). This assistance may be the mere activation of a warning device 34, or may include activating a control override unit 35 that temporarily takes over the control of manual controls 36 that are used by the operator 28 and/or the system 20. Such manual controls 36 may include a directional unit 36A (e.g., steering unit), an acceleration unit 36B, and a braking unit 36C of the host vehicle 22. The warning device 34 may include, or may be, an audible device 34A, a visual device 34B, and/or a haptic device 34C.
  • The system 20 may include the warning device 34, the control override unit 35, the manual controls 36, an object detector 38, a lane positioning detector 40, an adjacent vehicle detector 42, a vehicle speed sensor 43, a vehicle-to-vehicle (V2V) transmitter 44, a V2V receiver 46, and a controller 48. The controller 48 may include a processor 50 and an electronic storage medium 52. The processor 50 may be a microprocessor or other control circuitry such as analog and/or digital control circuitry including an application specific integrated circuit (ASIC) for processing data as is known by one with skill in the art. The storage medium 52 of the controller 48 may be non-volatile memory, such as electrically erasable programmable read-only memory (EEPROM) for storing one or more routines, thresholds, and captured data, hereafter referred to as an application 54 (i.e., the computer software product). The application 54 may be executed by the processor 50 of the controller 48 to perform steps for determining if various signals received from one or more of the detectors 38, 40, 42 indicate that an object exists, the object is being approached by the host vehicle 22, and whether the host vehicle 22 should be in the center of the lane 24, a biased position, or some other position.
  • The object detector 38 of the system 20 is configured to detect an object 56 that may be located outside of the lane 24 that the host vehicle 22 is traveling on, and proximate to one of the two defining boundaries 30, 32. The object 56 may be initially forward of the host vehicle 22, may be stationary, may be moving at a speed slower than the host vehicle 22, or may be moving at a speed in a different direction than the host vehicle. Although the object 56 is illustrated as a parked vehicle, other non-limiting examples may include a guard rail, traffic cones, a person, and others.
  • The object detector 38 may be, but is not limited to, one or more of a Light Detection and Ranging (LiDAR) device 38A, a radar device 38B, and/or an image capture device or camera 38C. Other devices suitable to detect an approaching object 56 such as a microphone and an ultrasonic transceiver are also contemplated. It is contemplated that two or more of these devices 38A, 38B, 38C may cooperate to detect and classify the approaching object 56 while the host vehicle 22 is moving. For example, information from the radar device 38B and the camera 38C may be combined to reliably detect the object 56 ahead of the host vehicle 22. In one embodiment, the LiDAR device 38A may be preferable for determining that the object 56 ahead of the host vehicle 22 is, for example, a parked vehicle or a pedestrian standing proximate to the lane boundary 30. However, advancements in radar and image processing of images captured by the camera 38C are expected, so those devices may be preferable in the future.
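  • As a loose illustration of the radar-and-camera combination mentioned above, the sketch below confirms an object only when both sensors report ranges that agree within a gate, then blends the two ranges. The function name, the gate value, and the blending weights are illustrative assumptions; the patent does not specify any particular fusion algorithm.

```python
from typing import Optional

def fuse_object_range(radar_range_m: Optional[float],
                      camera_range_m: Optional[float],
                      gate_m: float = 2.0) -> Optional[float]:
    """Confirm object 56 only when radar 38B and camera 38C roughly agree.

    Returns a blended range in meters, or None if the object is not confirmed.
    Real fusion (tracking, covariance gating, classification) is far more
    involved; this only sketches the idea of combining the two devices.
    """
    if radar_range_m is None or camera_range_m is None:
        return None                      # one sensor sees nothing: not confirmed
    if abs(radar_range_m - camera_range_m) > gate_m:
        return None                      # measurements disagree: not confirmed
    # Weight radar more heavily, since it typically measures range more precisely.
    return 0.7 * radar_range_m + 0.3 * camera_range_m
```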
  • While the object detector 38 is illustrated as being mounted at the front of the host vehicle 22, it is contemplated that the various devices 38A, 38B, 38C may be distributed and/or duplicated at various locations about the host vehicle 22. For example, the camera 38C, or duplicates of the camera, may be located rearward on the host vehicle 22 so that the lane boundaries 30, 32, and other boundaries of the roadway 26 can be detected. Similarly, the radar device 38B, or duplicates of the radar device, may be mounted at each corner of the host vehicle 22 so that, in addition to detecting the object 56, an adjacent vehicle 58 in an adjacent lane 24 of the roadway 26 may be detected.
  • The lane positioning detector 40 of the system 20 is configured to determine a relative position (see arrow 60 in FIG. 1) of the host vehicle 22 in the lane 24 with respect to a lane centerline C, the boundaries 30, 32, or other markings/features of the roadway 26. The lane positioning detector 40 may be, or may include, an image capture device or camera 40A, a geographic navigation device 40B (e.g., a global positioning system (GPS)), and/or other devices configured to determine the vehicle position within the lane 24. The lane positioning detector 40 is shown mounted at the front of the host vehicle 22, but other locations such as on the roof of the host vehicle 22, or within the occupant compartment and looking through the windshield of the host vehicle 22, are also contemplated.
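  • For illustration, if the camera 40A reports the lateral distances from the host vehicle 22 to each lane boundary, the relative position 60 with respect to the centerline C reduces to half the difference of those distances. The function and sign convention below are assumptions made for the sketch, not details taken from the patent.

```python
def offset_from_centerline(dist_to_left_boundary_m: float,
                           dist_to_right_boundary_m: float) -> float:
    """Signed offset of host vehicle 22 from centerline C (+ = right of center).

    Example: in a 3.6 m lane, distances of 2.3 m (left) and 1.3 m (right)
    give an offset of +0.5 m, i.e. the vehicle sits 0.5 m right of center.
    """
    return (dist_to_left_boundary_m - dist_to_right_boundary_m) / 2.0
```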
  • The adjacent vehicle detector 42 of the system 20 is configured to determine a distance (see arrow 62 in FIG. 1), which may be a lateral distance, measured between the adjacent vehicle 58 and the host vehicle 22. The adjacent vehicle detector 42 may include, but is not limited to, one or more of a LiDAR device 42A, a radar device 42B, and/or an image capture device or camera 42C. Other devices suitable to detect an adjacent vehicle, such as a microphone and an ultrasonic transceiver, are also contemplated. It is contemplated that two or more of these devices 42A, 42B, 42C may cooperate to detect and classify the adjacent vehicle 58 and measure distance 62 while the host vehicle 22 is moving. For example, information from the radar device 42B and the camera 42C may be combined to reliably detect the adjacent vehicle 58 and/or measure distance 62 ahead of the host vehicle 22. In one embodiment, the LiDAR device 42A may be preferable for detecting the adjacent vehicle 58 ahead of the host vehicle 22. However, advancements in radar and image processing of images captured by the camera 42C are expected, so those devices may be preferable in the future.
  • While the adjacent vehicle detector 42 is illustrated as being mounted at the side of the host vehicle 22, it is contemplated that the various devices 42A, 42B, 42C may be distributed and/or duplicated at various locations about the host vehicle 22. For example, the camera 42C, or duplicates of the camera, may be located rearward on the host vehicle 22 so that approaching adjacent vehicles 58 may be detected. Similarly, the radar device 42B, or duplicates of the radar device, may be mounted at each corner of the host vehicle 22 so that, in addition to detecting the adjacent vehicle 58, objects 56 adjacent to the roadway 26 may be detected. It is contemplated and understood that the detectors 38, 40, 42 may share various devices. For example, the camera 38C may be, or may be capable of functioning as, the camera 42C. Furthermore, if one camera should fail, the camera of another detector may be used by the system 20 instead.
  • The V2V transmitter 44 may be configured to transmit a host signal 64 that may generally indicate a particular position of the host vehicle 22 within the lane 24. The V2V transmitter 44 may be, but is not limited to, a Dedicated Short Range Communications (DSRC) system that uses the known IEEE 802.11p communications protocol. The V2V receiver 46 may be configured to receive an object signal 66 from the object 56 that may, for example, indicate the object (i.e., in the example of a vehicle) is parked. The object signal 66 may also include GPS data or other location data that, when received by the controller 48 of the system 20, provides further information pertaining to the distance between the host vehicle 22 and the object 56. The V2V receiver 46 may also be configured to receive an adjacent vehicle signal 68 from the adjacent vehicle 58 that may indicate the adjacent vehicle 58 is in a particular position relative to the lane 24 that the adjacent vehicle is traveling in.
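  • The sketch below shows one way the V2V payloads mentioned above might be represented in software. The class names, fields, and JSON encoding are purely illustrative assumptions; production DSRC messages follow standardized formats (e.g., SAE J2735), which the patent does not detail.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class HostSignal:              # host signal 64 broadcast by transmitter 44
    vehicle_id: str
    lane_position: str         # 'centered' or 'biased'

@dataclass
class ObjectV2VSignal:         # object signal 66 received by receiver 46
    vehicle_id: str
    parked: bool
    latitude: float
    longitude: float

def encode(msg) -> bytes:
    """Serialize a message for transmission (JSON used purely for illustration)."""
    return json.dumps(asdict(msg)).encode("utf-8")

# Example: announce that the host vehicle 22 has moved to the biased position 90.
payload = encode(HostSignal(vehicle_id="host-22", lane_position="biased"))
```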
  • The application 54, which may be stored in the electronic storage medium 52 of the controller 48, may include an object module 70, a lane positioning module 72, and an adjacent vehicle module 74. The object detector 38 is configured to output an object signal 76 for processing by at least the object module 70. The lane positioning detector 40 is configured to output a lane positioning signal 78 for processing by at least the lane positioning module 72. The adjacent vehicle detector 42 is configured to output an adjacent vehicle signal 80 for processing by at least the adjacent vehicle module 74. The speed sensor 43 is configured to send a speed signal 81 to the controller 48 for processing by at least the adjacent vehicle module 74 to generally determine the initiation and rate of a vehicle position shift within the lane 24. The transmitter 44 is configured to receive a transmit command 82 from the controller 48 that commands the transmitter 44 to transmit the host signal 64 (see FIG. 1) indicative of the host vehicle 22 being in a particular position with respect to the lane boundaries 30, 32. The receiver 46 is configured to receive various, external, signals, such as the object signal 66 and the adjacent vehicle signal 68, and send a receiver signal 84, indicative of at least signals 66, 68, to the controller 48 for processing.
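  • The routing of detector signals to the modules of the application 54 can be pictured as below. All class names and the dictionary layout are hypothetical; the sketch only mirrors the signal-to-module assignments described in this paragraph.

```python
from dataclasses import dataclass

@dataclass
class ObjectSignal:            # object signal 76 from detector 38
    detected: bool

@dataclass
class LanePositionSignal:      # lane positioning signal 78 from detector 40
    offset_from_center_m: float

@dataclass
class AdjacentVehicleSignal:   # adjacent vehicle signal 80 from detector 42
    present: bool

@dataclass
class SpeedSignal:             # speed signal 81 from sensor 43
    speed_mps: float

class Controller:
    """Sketch of controller 48 routing each signal to its module in application 54."""
    def step(self, obj: ObjectSignal, lane: LanePositionSignal,
             adj: AdjacentVehicleSignal, spd: SpeedSignal) -> dict:
        return {
            "object_module_70": obj.detected,
            "lane_positioning_module_72": lane.offset_from_center_m,
            "adjacent_vehicle_module_74": (adj.present, spd.speed_mps),
        }
```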
  • In operation of the system 20, and in one scenario where the object detector 38 does not detect the object 56, the object detector 38 may send an object signal 76, indicative of ‘no object detected’, to the controller 48 for utilization by the object module 70 of the application 54. Simultaneously, the lane positioning detector 40 may be sending a lane positioning signal 78 to the controller 48 for utilization by the lane positioning module 72 of the application 54. If the lane positioning signal 78 is indicative of the host vehicle 22 being in a centered position 86 (i.e., centered to centerline C or centered within the lane 24, see FIG. 1), the lane positioning module 72, executed in unison with the object module 70, may cause the host vehicle 22 to remain in the centered position 86. If the host vehicle 22 is not in the centered position 86, the positioning module 72 may cause the controller 48 to initiate a command signal 88 to the directional unit 36A directing the host vehicle 22 to adjust toward the centered position 86 while moving in a forward direction. In one embodiment, the host vehicle 22 may remain centered to the lane 24 even when the adjacent vehicle detector 42 sends an adjacent vehicle signal 80 to the adjacent vehicle module 74 indicating an adjacent vehicle 58 is in the adjacent lane 24.
  • In another operating scenario, the object detector 38 may detect an object 56 and thus send an object signal 76, indicative of at least ‘an upcoming object detected’, to the controller 48 for utilization by the object module 70 of the application 54. Simultaneously, the lane positioning detector 40 may be sending a lane positioning signal 78 to the controller 48 for utilization by the lane positioning module 72 of the application 54. If the lane positioning signal 78 is indicative of the host vehicle 22 being in the centered position 86, the lane positioning module 72, executed in unison with the object module 70, may direct the controller 48 to send a command signal 88 to the directional unit 36A directing the host vehicle 22 to adjust toward a biased position 90 (see FIG. 1). This will cause the host vehicle 22 to shift away from the object 56, thereby increasing the clearance between the object 56 and the passing host vehicle 22. More specifically, if the object 56 is detected as being proximate to the lane boundary 32, the biased position 90 will amount to a host vehicle shift toward the opposite lane boundary 30, and vice-versa.
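  • A minimal version of the two operating scenarios above, reduced to code, might look like the following. The sign convention (positive offsets toward the right boundary), the bias magnitude, and the proportional gain are assumptions made for the sketch, not values from the patent.

```python
def command_signal_88(object_detected: bool,
                      object_boundary: str,      # 'left' or 'right': boundary the object is near
                      current_offset_m: float,   # + = toward the right boundary (assumed)
                      bias_magnitude_m: float = 0.5,
                      gain: float = 0.8) -> float:
    """Steering correction toward the centered position 86 or biased position 90.

    No object: regulate the lateral offset back to zero (centered position 86).
    Object near one boundary: target an offset toward the opposite boundary
    (biased position 90), i.e. away from the object, and vice-versa.
    """
    if not object_detected:
        target_m = 0.0
    elif object_boundary == "right":
        target_m = -bias_magnitude_m   # object on the right: shift toward the left boundary
    else:
        target_m = +bias_magnitude_m   # object on the left: shift toward the right boundary
    lateral_error_m = target_m - current_offset_m
    return gain * lateral_error_m      # proportional correction; gain is illustrative only
```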
  • When the object detector 38 detects an object 56 ahead of the host vehicle 22 and that object is determined to be within a prescribed distance from the lane boundary 32, or it is determined that the host vehicle 22 will pass within a minimum threshold distance from the object 56, the system 20 may place the host vehicle 22 in the biased position 90 (i.e., offset position) generally at the moment the host vehicle 22 passes the object 56. In this way, the system 20 creates a safe and/or comfortable distance between the host vehicle 22 and the object 56 at the moment of passing. Also, the lane positioning detector 40 provides the necessary data to the application 54 to assure this additional clearance is maintained while passing the object 56. That is, the lane positioning module 72, with the real-time data collected by the lane positioning detector 40, verifies that the host vehicle 22 is in the biased position 90. In one example, this clearance may be about one meter (1 m). In another example, the clearance may be a function of the width of the host vehicle 22 and a width of the lane 24.
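  • A worked example of the clearance relationship described above: when centered, the host vehicle 22 already has (lane width − vehicle width)/2 of room on each side, so only the shortfall relative to the desired clearance (about one meter in the example) has to be made up by the lane bias, capped so the vehicle stays inside the lane. The function below is an illustrative assumption, not a formula from the patent.

```python
def bias_magnitude(lane_width_m: float, host_width_m: float,
                   desired_clearance_m: float = 1.0) -> float:
    """Lateral shift needed so the gap to an object at the near boundary
    reaches the desired clearance, without leaving the lane on the far side."""
    side_room_m = (lane_width_m - host_width_m) / 2.0   # gap to each boundary when centered
    shortfall_m = max(0.0, desired_clearance_m - side_room_m)
    return min(shortfall_m, side_room_m)

# Example: a 3.6 m lane and a 1.9 m wide host vehicle leave 0.85 m of room when
# centered, so a shift of only 0.15 m achieves roughly 1 m of clearance while passing.
```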
  • In one example, the lane positioning detector 40, or another detector, may detect a forward distance between the detected object 56 and the forward moving host vehicle 22. This forward distance along with the velocity of the host vehicle obtained from the speed signal 81 may be utilized by the lane positioning module 72 to determine when the movement from the centered position 86 to the biased position 90 should be initiated. This same data may also be used to determine the rate at which the host vehicle 22 moves from the centered position 86 to the biased position 90.
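  • Put concretely, the forward distance and the speed signal 81 fix how much time remains before the object 56 is reached, and that time budget determines both when the shift should start and how fast it must proceed. The policy below (shift at a comfortable lateral rate, starting just early enough to finish at the moment of passing) is an assumption for the sketch; the patent leaves the exact scheduling open.

```python
def plan_lane_shift(forward_distance_m: float, speed_mps: float,
                    lateral_shift_m: float, comfort_rate_mps: float = 0.3):
    """Return (seconds until the shift should start, lateral rate in m/s).

    Times the move from the centered position 86 to the biased position 90 so
    that it completes as the host vehicle 22 reaches the object 56; if the
    object is already too close, start immediately at the faster rate that
    still finishes in time.
    """
    time_to_object_s = forward_distance_m / max(speed_mps, 0.1)
    shift_duration_s = lateral_shift_m / comfort_rate_mps
    if shift_duration_s <= time_to_object_s:
        return time_to_object_s - shift_duration_s, comfort_rate_mps
    return 0.0, lateral_shift_m / time_to_object_s

# Example: an object 60 m ahead with the host at 15 m/s and a 0.5 m shift
# -> start after about 2.3 s and shift at 0.3 m/s.
```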
  • The additional clearance created between the object 56 and the host vehicle 22 may be in anticipation of the object 56 inadvertently moving into or toward the lane 24. For example, the application 54 executed by the processor 50 of the controller 48 may determine, via for example one of the cameras 38C, 42C, that the object 56 is a parked vehicle. In that case, by providing the additional clearance, the host vehicle 22 anticipates that a driver of the parked vehicle may open a door into the lane 24 without taking proper precautions. In another example, the object 56 may be a person standing at, or beside, the lane boundary 32, which may be a curb.
  • Continuing the operating scenario, with the host vehicle 22 shifting toward the biased position 90, or being in the biased position 90, the controller 48 may send a command 82 to the transmitter 44 directing the transmitter to transmit a signal 64 indicative of the host vehicle 22 being in the biased position 90. The signal 64 may be received by the adjacent vehicle 58, if autonomous, causing the adjacent vehicle to also shift lane position.
  • Also during operation of the system 20, the V2V receiver 46 may be configured to receive the object signal 66 from the object 56. In an example where the object 56 includes GPS capability, the object signal 66 may include data indicating the position of the object relative to the roadway 26. Furthermore, the V2V receiver 46 may also be configured to receive an adjacent vehicle signal 68 from the adjacent vehicle 58 indicating that the adjacent vehicle 58 is in, or is going to enter into, a biased position 90.
  • In yet another embodiment, and when the object 56 is detected and the host vehicle 22 is in the centered position 86, the lane positioning module 72 may switch from a first set of left and right error limits to a second set of left and right error limits. All of the error limits may be pre-programmed into the controller 48. The left and right error limits of the first set may be substantially equivalent to one another. With the error limits equivalent, the host vehicle 22 may be held substantially at the centered position 86. In contrast, the left and right error limits of the second set may not be equivalent to one another. When the second set is applied, the host vehicle adjusts, or shifts, toward the biased position 90. More specifically, if the object 56 is detected at the left boundary 32, the second set of left and right error limits applied may have a right error limit that is larger than a left error limit. In this embodiment, a ‘direct’ command to move from the centered position 86 to the biased position 90 need not be sent from the controller 48 to the directional unit 36A.
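  • The sketch below illustrates one plausible reading of the variable error limits described in this embodiment: the steering loop is allowed a band of lateral deviation left and right of the centerline and settles toward the middle of that band, so switching from a symmetric first set to an asymmetric second set shifts the vehicle toward the biased position 90 without any explicit move command. The band interpretation, the numeric limits, and the sign convention are assumptions, not details stated in the patent.

```python
from dataclasses import dataclass

@dataclass
class ErrorLimits:
    left_m: float    # allowed lateral deviation toward the left lane boundary
    right_m: float   # allowed lateral deviation toward the right lane boundary

# First set: equal limits hold the host vehicle substantially at the centered position 86.
FIRST_SET = ErrorLimits(left_m=0.2, right_m=0.2)
# Second set: with an object detected at the left boundary, a larger right limit
# lets the steering loop settle toward the biased position 90.
SECOND_SET = ErrorLimits(left_m=0.0, right_m=1.0)

def settled_offset(limits: ErrorLimits) -> float:
    """Midpoint of the allowed error band, as an offset from the centerline.

    Assumed reading: the steering loop regulates toward the middle of the
    permitted band, so an asymmetric band shifts the vehicle off-center.
    Positive values are toward the right boundary.
    """
    return (limits.right_m - limits.left_m) / 2.0

assert settled_offset(FIRST_SET) == 0.0     # centered position 86
assert settled_offset(SECOND_SET) == 0.5    # biased 0.5 m toward the right boundary
```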
  • Accordingly, the system 20 for automated operation of the host vehicle 22 advances the automated vehicle arts by enabling a system, application, or controller to determine when a host vehicle should move from a centered position 86 within a lane 24 to a biased or off-centered position 90 to create a comfortable distance between an object 56 that may be stationary and the host vehicle 22.
  • The various functions described above may be implemented or supported by a computer program that is formed from computer readable program codes, and that is embodied in a computer readable medium. Computer readable program codes may include source codes, object codes, executable codes, and others. Computer readable mediums may be any type of media capable of being accessed by a computer, and may include Read Only Memory (ROM), Random Access Memory (RAM), a hard disk drive, a compact disc (CD), a digital video disc (DVD), or other forms.
  • Terms used herein such as component, application, module, system, and the like are intended to refer to a computer-related entity, either hardware, a combination of hardware and software, or software execution. By way of example, an application may be, but is not limited to, a process running on a processor, a processor, an object, an executable, a thread of execution, a program, and/or a computer. It is understood that an application running on a server, and the server itself, may each be a component. One or more applications may reside within a process and/or thread of execution, and an application may be localized on one computer and/or distributed between two or more computers.
  • While the invention has been described in detail in connection with only a limited number of embodiments, it should be readily understood that the invention is not limited to such disclosed embodiments. Rather, the invention can be modified to incorporate any number of variations, alterations, substitutions or equivalent arrangements not heretofore described, but which are commensurate with the spirit and scope of the invention. Additionally, while various embodiments of the invention have been described, it is to be understood that aspects of the invention may include only some of the described embodiments. Accordingly, the invention is not to be seen as limited by the foregoing description.

Claims (20)

Having thus described the invention, it is claimed:
1. A system for semi-autonomous or autonomous operation of a host vehicle comprising:
an object detector configured to detect an object proximate to a lane boundary and output an object signal; and
a controller configured to process the object signal and direct the host vehicle away from the lane boundary upon detection of the object.
2. The system set forth in claim 1, further comprising:
a lane positioning detector configured to assist in placing the host vehicle in a centered position when the object detector has not detected the object, and assist in placing the host vehicle in a biased position away from the lane boundary when the object is detected.
3. The system set forth in claim 2, wherein the controller is pre-programmed with a first set of left and right of center error limits applied to substantially maintain the host vehicle at a center position when the object is not detected, and a second set of error limits applied to substantially maintain the host vehicle at the biased position when the object is detected.
4. The system set forth in claim 2, further comprising:
a transmitter configured to transmit a signal indicative of the host vehicle moving from the centered position to the biased position for receipt by an adjacent vehicle.
5. The system set forth in claim 4, wherein the transmitter is a Dedicated Short Range Communications (DSRC) system.
6. The system set forth in claim 1, further comprising:
a receiver for receiving an object signal from the object and indicative of an object position.
7. The system set forth in claim 1, wherein the object detector is located at a forward portion of the host vehicle.
8. The system set forth in claim 1, wherein the object detector includes a camera.
9. The system set forth in claim 1, wherein the object detector includes a radar device.
10. The system set forth in claim 1, wherein the object detector includes a LiDAR device.
11. The system set forth in claim 2, wherein the lane positioning detector includes a camera.
12. The system set forth in claim 2, wherein the lane positioning detector includes a geographic navigation device.
13. The system set forth in claim 1, wherein the object is stationary.
14. The system set forth in claim 1, wherein the object is a parked vehicle.
15. An autonomous vehicle comprising:
a controller including a processor and an electronic storage medium; and
a steering unit constructed and arranged to receive a steering command from the controller for moving the autonomous vehicle from a centered position to a biased position upon receipt of an object detected signal.
16. The autonomous vehicle set forth in claim 15, further comprising:
an object detector configured to detect an object proximate to a first lane boundary of a lane upon which the autonomous vehicle is moving, and send the object detected signal to the controller.
17. The autonomous vehicle set forth in claim 16, wherein movement from the centered position to the biased position is away from the first lane boundary.
18. The autonomous vehicle set forth in claim 17, further comprising:
a lane positioning detector configured to detect an autonomous vehicle position with respect to the first lane boundary and an opposite, second, lane boundary, wherein a lane positioning signal is sent from the lane positioning detector to the controller to assist in moving the autonomous vehicle between the centered and biased positions.
19. A computer software product executed by a controller of an automated vehicle configured to receive object and lane positioning signals to control lane positioning of the automated vehicle based on the detection of an object proximate to a lane boundary, the computer software product comprising:
an object module configured to receive the object signal and determine if the object is proximate to the lane boundary; and
a lane positioning module configured to receive the lane positioning signal to determine lane positioning of the automated vehicle, and at least in-part effectuate movement of the automated vehicle from a centered position to a biased position that is away from the lane boundary if the object module determines that the object is proximate to the lane boundary.
20. The computer software product set forth in claim 19, further comprising:
a first set of left and right of center error limits utilized to substantially maintain the host vehicle at the center position when the object is not detected, and a second set of biased left and right error limits utilized to substantially maintain the host vehicle at the biased position when the object is detected.
US15/653,879 2017-07-19 2017-07-19 Variable steering error limits for automated vehicle control Abandoned US20190027034A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US15/653,879 US20190027034A1 (en) 2017-07-19 2017-07-19 Variable steering error limits for automated vehicle control
PCT/US2018/042465 WO2019018378A1 (en) 2017-07-19 2018-07-17 Variable steering error limits for automated vehicle control

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US15/653,879 US20190027034A1 (en) 2017-07-19 2017-07-19 Variable steering error limits for automated vehicle control

Publications (1)

Publication Number Publication Date
US20190027034A1 true US20190027034A1 (en) 2019-01-24

Family

ID=65016148

Family Applications (1)

Application Number Title Priority Date Filing Date
US15/653,879 Abandoned US20190027034A1 (en) 2017-07-19 2017-07-19 Variable steering error limits for automated vehicle control

Country Status (2)

Country Link
US (1) US20190027034A1 (en)
WO (1) WO2019018378A1 (en)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080186205A1 (en) * 1995-06-07 2008-08-07 Intelligent Technologies International, Inc. Wireless Sensing and Communications System of Roadways
WO2007145564A1 (en) * 2006-06-11 2007-12-21 Volvo Technology Corporation Method and apparatus for using an automated lane keeping system to maintain lateral vehicle spacing
US8462988B2 (en) * 2007-01-23 2013-06-11 Valeo Schalter Und Sensoren Gmbh Method and system for universal lane boundary detection
US8452535B2 (en) * 2010-12-13 2013-05-28 GM Global Technology Operations LLC Systems and methods for precise sub-lane vehicle positioning
US8473144B1 (en) * 2012-10-30 2013-06-25 Google Inc. Controlling vehicle lateral lane positioning

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5913375A (en) * 1995-08-31 1999-06-22 Honda Giken Kogyo Kabushiki Kaisha Vehicle steering force correction system
US20090015724A1 (en) * 2007-07-09 2009-01-15 Samsung Electronics Co., Ltd. Broadcasting processing apparatus and control method thereof
US20130197758A1 (en) * 2012-01-27 2013-08-01 Denso Corporation Vehicle automatic steering control apparatus
US8504233B1 (en) * 2012-04-27 2013-08-06 Google Inc. Safely navigating on roads through maintaining safe distance from other vehicles
US20140012188A1 (en) * 2012-07-05 2014-01-09 Pedro Orrego Silva Device for treatment of a blood vessel
US20140188345A1 (en) * 2012-12-28 2014-07-03 Fuji Jukogyo Kabushiki Kaisha Vehicle driving assistance device
US20140309884A1 (en) * 2013-04-10 2014-10-16 Magna Electronics Inc. Rear collision avoidance system for vehicle
US20160052547A1 (en) * 2013-05-01 2016-02-25 Toyota Jidosha Kabushiki Kaisha Driving support apparatus and driving support method
US20150166062A1 (en) * 2013-12-12 2015-06-18 Magna Electronics Inc. Vehicle control system with traffic driving control
US20160306357A1 (en) * 2015-04-17 2016-10-20 Delphi Technologies, Inc. Automated vehicle system with position bias for motorcycle lane splitting

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11827241B2 (en) 2018-10-29 2023-11-28 Motional Ad Llc Adjusting lateral clearance for a vehicle using a multi-dimensional envelope
US20210394757A1 (en) * 2018-11-05 2021-12-23 Zoox, Inc. Vehicle trajectory modification for following
US11970168B2 (en) * 2018-11-05 2024-04-30 Zoox, Inc. Vehicle trajectory modification for following
US11352023B2 (en) 2020-07-01 2022-06-07 May Mobility, Inc. Method and system for dynamically curating autonomous vehicle policies
US11565716B2 (en) 2020-07-01 2023-01-31 May Mobility, Inc. Method and system for dynamically curating autonomous vehicle policies
US11667306B2 (en) 2020-07-01 2023-06-06 May Mobility, Inc. Method and system for dynamically curating autonomous vehicle policies
US20220161821A1 (en) * 2020-11-25 2022-05-26 Tusimple, Inc. Autonomous vehicle handling in unusual driving events
US11648961B2 (en) * 2020-11-25 2023-05-16 Tusimple, Inc. Autonomous vehicle handling in unusual driving events
US11396302B2 (en) * 2020-12-14 2022-07-26 May Mobility, Inc. Autonomous vehicle safety platform system and method
US11673566B2 (en) 2020-12-14 2023-06-13 May Mobility, Inc. Autonomous vehicle safety platform system and method
US11673564B2 (en) 2020-12-14 2023-06-13 May Mobility, Inc. Autonomous vehicle safety platform system and method
US11679776B2 (en) 2020-12-14 2023-06-20 May Mobility, Inc. Autonomous vehicle safety platform system and method
US11472436B1 (en) 2021-04-02 2022-10-18 May Mobility, Inc Method and system for operating an autonomous agent with incomplete environmental information
US11745764B2 (en) 2021-04-02 2023-09-05 May Mobility, Inc. Method and system for operating an autonomous agent with incomplete environmental information
US11845468B2 (en) 2021-04-02 2023-12-19 May Mobility, Inc. Method and system for operating an autonomous agent with incomplete environmental information
US11565717B2 (en) 2021-06-02 2023-01-31 May Mobility, Inc. Method and system for remote assistance of an autonomous agent
US12012123B2 (en) 2021-12-01 2024-06-18 May Mobility, Inc. Method and system for impact-based operation of an autonomous agent
US11814072B2 (en) 2022-02-14 2023-11-14 May Mobility, Inc. Method and system for conditional operation of an autonomous agent

Also Published As

Publication number Publication date
WO2019018378A1 (en) 2019-01-24

Similar Documents

Publication Publication Date Title
US20190027034A1 (en) Variable steering error limits for automated vehicle control
US10710580B2 (en) Tailgating situation handling by an automated driving vehicle
US10919525B2 (en) Advanced driver assistance system, vehicle having the same, and method of controlling the vehicle
US10497261B2 (en) Traffic blocking avoidance system for an automated vehicle
US10259453B2 (en) Collision avoidance based on front wheel off tracking during reverse operation
US9889847B2 (en) Method and system for driver assistance for a vehicle
US20210197807A1 (en) Advanced driver assistance system, vehicle having the same, and method of controlling vehicle
KR102660838B1 (en) Vehicle and method for controlling thereof
US20160306357A1 (en) Automated vehicle system with position bias for motorcycle lane splitting
US20210086768A1 (en) Driving assistance control apparatus, driving assistance system, and driving assistance control method for vehicle
WO2018002984A1 (en) Vehicle control method and vehicle control device
CN111661047B (en) Lane position sensing and tracking in a vehicle
KR20200086764A (en) Vehicle and method for controlling thereof
CN108010385B (en) Automatic vehicle cross traffic detection system
US20190025433A1 (en) Automated vehicle lidar tracking system for occluded objects
CN112193246A (en) Vehicle and method for performing inter-vehicle distance control
US11195417B2 (en) Vehicle and method for predicating collision
US11370489B2 (en) Vehicle and method for steering avoidance control
CN112455433A (en) Vehicle and method of controlling the same
KR102356612B1 (en) Collision Avoidance device, Vehicle having the same and method for controlling the same
KR20220119229A (en) Advanced Driver Assistance System, and Vehicle having the same
US20200377079A1 (en) Vehicle and method of controlling thereof
US10909851B2 (en) Vehicle intent communication system
CN115605385A (en) Method for guiding a motor vehicle
EP4230490A1 (en) Lane keeping assist and a method for assisting a lane keeping

Legal Events

Date Code Title Description
AS Assignment

Owner name: DELPHI TECHNOLOGIES, INC., MICHIGAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:XU, WENDA;WEI, JUNQING;REEL/FRAME:043043/0600

Effective date: 20170626

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

AS Assignment

Owner name: APTIV TECHNOLOGIES LIMITED, BARBADOS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:DELPHI TECHNOLOGIES INC.;REEL/FRAME:047153/0902

Effective date: 20180101

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: MOTIONAL AD LLC, MASSACHUSETTS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:APTIV TECHNOLOGIES LIMITED;REEL/FRAME:053863/0399

Effective date: 20200917