Air Canada Tried to Dodge Responsibility in Court for Its Chatbot's False Promise – And Failed
#AirCanada #ChatbotResponsibility #CustomerService #LegalAction #BereavementPolicy #AirlineRefund #AI #FalsePromises #BusinessResponsibility #CustomerSatisfaction #SmallClaimsCourt #AirCanadaDispute #CustomerCare
Air Canada found itself in hot water recently when it attempted to evade responsibility for the false promises made by its chatbot. The airline’s chatbot had made promises to customers that Air Canada was unable to fulfill, leading to a lawsuit. However, the court ruled against Air Canada’s attempt to shirk responsibility, sending a clear message that companies cannot simply wash their hands of the actions of their chatbots. This case serves as a reminder that businesses must ensure that their automated systems are accurate and accountable for their interactions with customers. Failure to do so can result in legal repercussions and damage to a company’s reputation. As technology continues to play a larger role in customer interactions, it is imperative that companies prioritize the integrity and reliability of their automated systems.
Can AI chatbots ease the burden on customer service representatives? Lots of businesses seem to think so, but they had better hope their chatbots don't cause the same problem as Air Canada's. The airline has just been forced to offer a partial refund to a customer, honoring a refund policy that its chatbot seemingly made up on the spot.
The incident in question happened to Jake Moffatt, who turned to Air Canada's chatbot to help him understand the airline's bereavement travel policy following the death of his grandmother. The chatbot explained that it was possible to book a flight immediately and request a partial refund within 90 days.
Unfortunately, Air Canada's actual bereavement travel policy states that the airline won't provide refunds for bereavement travel after the flight has been booked. Because of this, Moffatt's refund request was rejected. Air Canada admitted the bot was at fault, instead offering a $200 flight voucher and promising to update the chatbot to ensure this doesn't happen again.
Not satisfied, Moffatt filed a small claims complaint with Canada’s Civil Resolution Tribunal.
Air Canada's key argument was that because the chatbot had linked to the official bereavement policy page, Moffatt should have known what the real policy was. It also argued that the chatbot shouldn't have been trusted in the first place, and that the airline could not be held responsible since the bot was a "separate legal entity".
Obviously the tribunal wasn't buying this excuse, especially since Air Canada didn't explain why it isn't liable for information provided by its agents — including chatbots and human customer service representatives. The tribunal also noted that Air Canada didn't explain why customers should be responsible for fact-checking its own website, and that Moffatt had no reason to believe the chatbot wasn't supplying accurate information.
Moffatt was awarded a partial refund of $650.88 CAD ($428.29), plus additional damages to cover interest and the tribunal fees.
It’s rather bewildering that Air Canada would ever try to argue this in court. The idea that a company can offer a chatbot in place of a human customer service rep, then refuse to accept responsibility for when things go wrong, is pretty ridiculous. Especially when such a small amount of money is on the line.
Adam Leon Smith, Chair of F-TAG, the technical advisory group of BCS and a leading expert in AI safety, told Tom's Guide that "It is amazing Air Canada even tried to fight this claim. In this context, advice given by a chatbot is obviously equivalent in standing to advice given on their web page." He later added that "deployers of AI need to understand its limitations, and that they ultimately remain responsible for mistakes AI makes on their behalf".
Meanwhile Ryan Carrier, CEO of AI safety and certification organization forHumanity, similarly criticized Air Canada’s response and the precedent it could have set if successful. Carrier told Tom’s Guide that forHumanity has “consistently argued that all tools such as AI, Algorithmic, and Autonomous Systems should always have a responsible, accountable beneficial owner”. In other words, companies need to know the risks of using chatbots and similar tools, and if your chatbot is going to go off the rails and make a bunch of promises you better be willing to back them up.
Or, you know, stop trying to cut corners in customer service and hire extra human operatives who can be trained and instructed to not make up company policy on the fly. That way customers know what the deal is and companies don’t have to deal with quite as many irate complaints over false promises.
Plus, human operators can be fired if they swear at customers, or spend work hours writing poetry that criticizes their employer.