Zerush@lemmy.ml to Technology@lemmy.ml · 3 days ago
Google's AI Deletes User's Entire Hard Drive, Issues Groveling Apology: "I Cannot Express How Sorry I Am" (futurism.com)
cross-posted to: [email protected], [email protected]
utopiah@lemmy.ml · 2 days ago
Well, there are guardrails from what I understood, including:
- executing commands (off by default)
- executing commands without user confirmation (off by default)

which are IMHO reasonable, but if the person this happened to is right, there is no filesystem sandbox, e.g. one limited solely to the project repository.
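To make the missing-sandbox point concrete, here's a minimal Python sketch of what a repository-scoped filesystem guardrail could look like. This is an illustration, not any real tool's API: `PROJECT_ROOT`, `is_inside_project`, and `guarded_delete` are hypothetical names.

```python
import os
from pathlib import Path

# Hypothetical sandbox root: the agent may only touch paths under here.
PROJECT_ROOT = Path("/home/user/my-project").resolve()

def is_inside_project(path: str) -> bool:
    """Return True only if `path` resolves to a location under PROJECT_ROOT.

    Resolving first defeats tricks like "../../" or symlinks pointing
    outside the repository.
    """
    resolved = Path(path).resolve()
    return resolved == PROJECT_ROOT or PROJECT_ROOT in resolved.parents

def guarded_delete(path: str) -> None:
    """Delete a file, refusing anything outside the project sandbox."""
    if not is_inside_project(path):
        raise PermissionError(f"refusing to touch {path}: outside project sandbox")
    os.remove(path)

# guarded_delete("/home/user/my-project/build/tmp.o")  # allowed
# guarded_delete("/home/user/Documents/thesis.docx")   # raises PermissionError
```

With a check like this in front of every destructive operation, even an agent running with command execution enabled couldn't reach files outside the repository; the thread's point is that no such layer apparently existed.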
Scrubbles@poptalk.scrubbles.tech · 2 days ago
Okay, that changes things. If they turned off those guardrails, then that's on them. Never blindly trust an LLM like that.