New York regulators have announced plans to issue cybersecurity regulations for hospitals, after a series of attacks crippled operations at medical facilities.
Under draft rules, New York will require general hospitals to develop and test incident response plans, assess their cybersecurity risks and install security technologies such as multifactor authentication (MFA). Hospitals must also develop secure software design practices for in-house applications and create processes for testing the security of software from vendors.
“Our interconnected world demands an interconnected defense against cyber-attacks, leveraging every resource available, especially at hospitals,” Governor Kathy Hochul said. “These new proposed regulations set forth a nation-leading blueprint to ensure New York State stands ready and resilient in the face of cyber threats.”
The proposed regulations require that hospitals develop response plans for a potential cybersecurity incident, including notification of appropriate parties. Hospitals will also be required to test their response plans to ensure that patient care continues while systems are restored to normal operations.
The proposed regulations mandate that each hospital’s cybersecurity program includes written procedures, guidelines, and standards to develop secure practices for in-house applications intended for use by the facility. Hospitals will also be required to establish policies and procedures for evaluating, assessing, and testing the security of externally developed applications used by the hospital.
And hospitals must establish a Chief Information Security Officer role, if one does not exist already, to enforce the new policies and to annually review and update them as needed.
The state published a comprehensive cybersecurity strategy in August, and at that time the governor said she would be exploring regulation for critical infrastructure sectors, including healthcare.
And in early November, the New York Department of Financial Services updated its Part 500 cybersecurity rules, requiring banks, insurers and other financial firms to implement MFA and develop written processes for dealing with cyberattacks.
YouTube will require AI disclosure
On Tuesday, YouTube announced a series of policy changes designed to inform viewers when content they are viewing has been generated by artificial intelligence (AI).
The policies are slated to go into effect next year, and they will include the following:
- A requirement that content creators disclose when generative AI was used to create realistic-looking scenes that never actually occurred or that depict people saying things they never said.
- Allowing users to submit a request for YouTube to remove content that simulates an identifiable person, including their face or voice. In this regard, YouTube warns people that not all requests will be honored, as there will be a higher threshold for requests to remove pure satire, parody and for content imitating public figures.
- Creating a separate process for certain music industry partners to request removal of content that “mimics an artist’s unique singing or rapping voice”.
YouTube said in a blog post that it has longstanding policies prohibiting technically manipulated content that misleads viewers and could pose a serious risk of egregious harm, as well as standards governing violence and hate speech. Those won’t change.
The limitation of this disclosure is that it is up to creators to make sure their videos meet the criteria, and the disclosure will be visible only in the video’s description field.
AI takes that tone with earnings calls
Many funds already use algorithms to comb through transcripts of earnings calls and company presentations to glean signals from executives’ choice of words, a process known as natural language processing (NLP). But now they are trying to find further messages in the way those words are spoken, the Financial Review reports.
“The idea is that audio captures more than just what is in text,” said Mike Chen, head of alternative alpha research at Robeco, an asset manager based in the Netherlands. “Even if you have a sophisticated semantic machine, it only captures semantics.”
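The transcript side of this approach can be illustrated with a toy example: score an earnings-call excerpt against a small sentiment lexicon. The word lists and the scoring rule below are illustrative assumptions for the sketch, not any fund’s actual model.

```python
# Minimal sketch of transcript-based signal extraction: count words from a
# tiny hand-made positive/negative lexicon and normalize by transcript length.
import re

# Hypothetical lexicons -- real systems use much larger, domain-tuned ones.
POSITIVE = {"growth", "strong", "record", "exceeded", "improved"}
NEGATIVE = {"decline", "weak", "headwinds", "missed", "uncertainty"}

def sentiment_score(transcript: str) -> float:
    """Return (positive - negative) word counts, normalized by total words."""
    words = re.findall(r"[a-z']+", transcript.lower())
    if not words:
        return 0.0
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return (pos - neg) / len(words)

excerpt = "We delivered record growth this quarter despite macro headwinds."
print(round(sentiment_score(excerpt), 3))  # prints 0.111
```

As the article notes, this kind of text-only scoring is exactly what audio analysis tries to go beyond: hesitation, filler words, and vocal tremor never make it into the transcript the model reads.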
Hesitation and filler words tend to be left out of transcripts, and AI can also pick up some “microtremors” that are imperceptible to the human ear.
Robeco, which manages over $80 billion in algorithmically driven funds, began adding audio signals picked up through AI into its strategies earlier this year. Chen said it had added to returns, and that he expected more investors to follow suit.