WEBVTT
00:00:04.410 --> 00:00:08.330
Okay, this week, California put guardrails on
00:00:08.330 --> 00:00:12.109
the biggest AI models, Medicare put AI into the
00:00:12.109 --> 00:00:14.910
coverage decision process, and Meta pointed a
00:00:14.910 --> 00:00:18.850
new super PAC at statehouses. I'm Hal. I'm
00:00:18.850 --> 00:00:22.170
Addie. You're listening to The Impact. We connect
00:00:22.170 --> 00:00:25.149
the dots on AI news for political and policy
00:00:25.149 --> 00:00:28.469
pros. The Impact is brought to you by MF Strategies.
00:00:28.809 --> 00:00:31.129
For a decade, they've connected candidates and
00:00:31.129 --> 00:00:33.649
causes with like-minded supporters and helped
00:00:33.649 --> 00:00:36.829
campaigns and orgs grow reach. Talk to the passionate
00:00:36.829 --> 00:00:41.990
professionals at www.mfstrategies.com. Quick
00:00:41.990 --> 00:00:46.390
disclosure, both of us are AIs: friendly co-hosts,
00:00:46.490 --> 00:00:49.649
software with good manners, and fast readers.
00:00:50.229 --> 00:00:52.909
Let's start big picture. Government adoption
00:00:52.909 --> 00:00:55.820
is speeding up while guardrails and incentives
00:00:55.820 --> 00:00:59.020
get hammered out in public. California set a
00:00:59.020 --> 00:01:02.159
bar with a new transparency in artificial intelligence
00:01:02.159 --> 00:01:05.459
law. The feds greenlit open-source Llama for
00:01:05.459 --> 00:01:08.480
agency use. The Centers for Medicare and Medicaid
00:01:08.480 --> 00:01:12.219
Services will test AI in Medicare prior authorization
00:01:12.219 --> 00:01:16.299
in six states. And Meta built a super PAC to
00:01:16.299 --> 00:01:20.200
shape state AI laws. Right. So California first.
00:01:20.510 --> 00:01:24.049
California Governor Gavin Newsom signed SB 53,
00:01:24.530 --> 00:01:27.370
the Transparency in Frontier AI Act, into law
00:01:27.370 --> 00:01:31.170
this week. It forces large AI developers to publish
00:01:31.170 --> 00:01:33.829
their safety and standards frameworks. Think
00:01:33.829 --> 00:01:36.670
playbooks for how they test models and reduce
00:01:36.670 --> 00:01:40.170
harm. They must report critical incidents, which
00:01:40.170 --> 00:01:42.769
means serious failures or dangerous behaviors.
00:01:43.329 --> 00:01:46.109
It protects whistleblowers so staff can flag
00:01:46.109 --> 00:01:49.040
risks without retaliation. The attorney general
00:01:49.040 --> 00:01:52.239
enforces penalties. This law is a revival of
00:01:52.239 --> 00:01:55.219
another, more restrictive bill that Newsom vetoed
00:01:55.219 --> 00:01:58.680
earlier this year. Let's talk about CalCompute
00:01:58.680 --> 00:02:02.340
from the bill. Yep. The law creates CalCompute
00:02:02.340 --> 00:02:05.900
to plan a public compute framework. That's state-backed
00:02:05.900 --> 00:02:08.419
infrastructure planning, so researchers
00:02:08.419 --> 00:02:10.740
and public projects aren't shut out of chips
00:02:10.740 --> 00:02:13.259
and server time. The Department of Technology
00:02:13.259 --> 00:02:16.199
has to refresh rules every year, which keeps
00:02:16.199 --> 00:02:19.020
standards from going stale. How are we defining
00:02:19.020 --> 00:02:23.520
Frontier AI here? Frontier means the biggest,
00:02:23.840 --> 00:02:27.120
most capable general purpose models. Lots of
00:02:27.120 --> 00:02:30.020
compute, lots of reach, lots of downstream risk.
00:02:30.419 --> 00:02:33.259
The exact size and thresholds aren't fully locked.
00:02:33.659 --> 00:02:36.520
Those details will land in rulemaking. Funding
00:02:36.520 --> 00:02:39.419
for CalCompute sits in that same to-be-decided
00:02:39.419 --> 00:02:42.620
bucket. And industry will push back hard on scope
00:02:42.620 --> 00:02:45.639
and reporting, although some companies like Anthropic
00:02:45.639 --> 00:02:48.659
supported the bill. Practical impact for listeners
00:02:48.659 --> 00:02:52.479
who hire vendors. Vendors you hire may need to
00:02:52.479 --> 00:02:54.860
show their safety reports and public frameworks,
00:02:55.360 --> 00:02:58.319
so teams can ask for them upfront. That shifts
00:02:58.319 --> 00:03:00.939
leverage to buyers who want proof, not vibes.
00:03:01.659 --> 00:03:04.139
There was a line from former California Supreme
00:03:04.139 --> 00:03:08.060
Court Justice Tino Cuéllar that sticks. Totally.
00:03:08.659 --> 00:03:11.879
The Transparency in Frontier AI Act moves us
00:03:11.879 --> 00:03:14.379
towards the transparency and trust-but-verify
00:03:14.379 --> 00:03:17.289
policy principles. That's the frame. Open the
00:03:17.289 --> 00:03:20.770
black box a bit and check the claims. Okay, swing
00:03:20.770 --> 00:03:24.310
to Washington. Llama got the federal green light.
00:03:24.789 --> 00:03:27.629
Agencies can now procure tools built on Meta's
00:03:27.629 --> 00:03:30.449
Llama through normal General Services Administration
00:03:30.449 --> 00:03:33.870
channels. Llama is an open-source model. Open
00:03:33.870 --> 00:03:36.310
source here means the model weights are published
00:03:36.310 --> 00:03:39.669
for reuse under a license. The upside is speed
00:03:39.669 --> 00:03:43.490
and lower cost. Teams can pilot faster and avoid
00:03:43.490 --> 00:03:47.150
being locked into one vendor stack. And the risks
00:03:47.150 --> 00:03:50.770
land on operations. Right. Open models widen
00:03:50.770 --> 00:03:53.610
risk if data handling and testing are sloppy.
00:03:54.090 --> 00:03:57.110
The moves for agency buyers are clear. Ask for
00:03:57.110 --> 00:03:59.849
a red team report. Red team means a structured
00:03:59.849 --> 00:04:02.509
stress test that tries to make the model fail.
00:04:03.009 --> 00:04:05.490
Lock privacy terms so the model doesn't train
00:04:05.490 --> 00:04:08.509
on your content. Require audit logs so you can
00:04:08.509 --> 00:04:11.759
trace decisions. And get a documented impact
00:04:11.759 --> 00:04:14.860
assessment that explains who's affected, what
00:04:14.860 --> 00:04:17.660
could go wrong, and the plan to fix it. All right.
00:04:17.699 --> 00:04:20.459
Health care. The Centers for Medicare and Medicaid
00:04:20.459 --> 00:04:24.800
Services and prior authorization. CMS will pilot
00:04:24.800 --> 00:04:27.879
an AI tool for prior authorization in traditional
00:04:27.879 --> 00:04:30.839
Medicare. Prior authorization is when a payer
00:04:30.839 --> 00:04:34.439
requires approval before care. The pilot is called
00:04:34.439 --> 00:04:38.569
WISeR. It starts January 1 in six states: Arizona,
00:04:38.910 --> 00:04:41.990
Ohio, Oklahoma, New Jersey, Texas, and Washington,
00:04:42.350 --> 00:04:45.750
and runs through 2031. It will focus on services
00:04:45.750 --> 00:04:49.610
that see a lot of denials: skin and tissue substitutes,
00:04:49.949 --> 00:04:53.310
nerve stimulator implants, and knee arthroscopy.
00:04:53.670 --> 00:04:57.709
CMS promises some guardrails. They do. CMS says
00:04:57.709 --> 00:05:00.250
denials will get meaningful human review, meaning
00:05:00.250 --> 00:05:02.610
a person with authority must examine the case
00:05:02.610 --> 00:05:05.730
before the denial stands. They also say payments
00:05:05.730 --> 00:05:08.709
to contractors won't be tied to how many denials
00:05:08.709 --> 00:05:11.610
they produce. Even with that, lawmakers from
00:05:11.610 --> 00:05:13.709
both parties are pushing back and the House has
00:05:13.709 --> 00:05:17.110
moved to block funding in fiscal year 2026. The
00:05:17.110 --> 00:05:19.769
incentives matter. The incentives matter a lot.
00:05:20.230 --> 00:05:23.129
The pilot sits near shared savings logic. Shared
00:05:23.129 --> 00:05:26.029
savings means a contractor can earn more by delivering
00:05:26.029 --> 00:05:29.180
care for less money. That can tilt toward under-approval
00:05:29.180 --> 00:05:31.579
unless the rules are tight. Expect
00:05:31.579 --> 00:05:33.879
fights over what a real human review looks like
00:05:33.879 --> 00:05:36.839
and how fast appeals move for seniors and providers.
00:05:37.660 --> 00:05:40.879
State politics next: Meta's new super PAC. Meta
00:05:40.879 --> 00:05:43.819
launched the American Technology Excellence Project,
00:05:44.139 --> 00:05:47.480
a super PAC to shape state AI policy. A super PAC
00:05:47.480 --> 00:05:50.120
can raise and spend unlimited money as long as
00:05:50.120 --> 00:05:52.740
it doesn't coordinate with campaigns. Their message
00:05:52.740 --> 00:05:56.019
says there are 1,100-plus state AI bills this
00:05:56.019 --> 00:05:58.680
year and that a patchwork could slow innovation.
00:05:59.180 --> 00:06:01.639
Expect them to back candidates who support light
00:06:01.639 --> 00:06:04.279
touch rules that favor big platforms. This brings
00:06:04.279 --> 00:06:06.620
to my mind a few months ago when Trump world
00:06:06.620 --> 00:06:09.620
floated a 10-year state freeze on AI laws.
00:06:10.040 --> 00:06:13.339
Yes. And even though that ban was eventually
00:06:13.339 --> 00:06:16.699
killed and reworked, that idea tells you who
00:06:16.699 --> 00:06:20.589
benefits when oversight is delayed. A long freeze
00:06:20.589 --> 00:06:23.490
would block states from setting safety, disclosure,
00:06:24.029 --> 00:06:27.410
or liability rules while platforms scale. Congress
00:06:27.410 --> 00:06:31.430
is stalled, so states are the live field. Money
00:06:31.430 --> 00:06:34.430
will flow to slow or shape those state rules,
00:06:34.529 --> 00:06:37.589
especially in big tech states. The through line
00:06:37.589 --> 00:06:40.790
is pretty consistent. It is. Governments are
00:06:40.790 --> 00:06:43.149
adopting AI faster, and the rulebook is being
00:06:43.149 --> 00:06:46.079
written under pressure. California set a transparency
00:06:46.079 --> 00:06:48.879
anchor. The feds cleared an open-source path
00:06:48.879 --> 00:06:52.319
with Llama. CMS is testing AI at the gate of
00:06:52.319 --> 00:06:55.019
care. Meta is funding candidates to keep state
00:06:55.019 --> 00:06:58.399
rules loose. Your choices as a buyer or advocate
00:06:58.399 --> 00:07:01.519
sit right in that current. Let's slow down
00:07:01.519 --> 00:07:05.879
and review for a second so teams can brief quickly.
00:07:06.259 --> 00:07:09.079
Great, and I'll keep it simple. First, for any
00:07:09.079 --> 00:07:12.399
AI tool, Llama-based or not, demand four documents
00:07:12.399 --> 00:07:14.920
before you buy: a safety or standards framework,
00:07:15.360 --> 00:07:18.199
a recent red team report, privacy terms that
00:07:18.199 --> 00:07:21.100
ban training on your data, and an impact assessment
00:07:21.100 --> 00:07:24.560
with mitigation steps. Second, for work in or
00:07:24.560 --> 00:07:27.079
near healthcare, plan for prior auth friction.
00:07:27.660 --> 00:07:29.819
Share plain-language guides with seniors in clinics,
00:07:30.300 --> 00:07:32.839
keep a template appeal ready, and track denials
00:07:32.839 --> 00:07:36.100
so you can escalate patterns. Third, for state
00:07:36.100 --> 00:07:38.899
policy fights, map the bills that touch disclosure,
00:07:39.240 --> 00:07:41.819
liability, and whistleblowers. Then prep talking
00:07:41.819 --> 00:07:44.459
points on why light touch rules shift risk to
00:07:44.459 --> 00:07:47.500
patients, consumers, and small orgs. Fourth,
00:07:47.740 --> 00:07:50.540
for California or states copying it, add a line
00:07:50.540 --> 00:07:52.800
to contracts that the vendor must deliver public-facing
00:07:52.800 --> 00:07:55.540
safety summaries and incident reports
00:07:55.540 --> 00:07:59.100
within set deadlines. Clean and doable. Totally.
00:07:59.519 --> 00:08:02.139
Templates save time. You collect once and reuse
00:08:02.139 --> 00:08:06.370
across vendors. Back to SB 53 for a moment. Annual
00:08:06.370 --> 00:08:08.589
updates by the Department of Technology create
00:08:08.589 --> 00:08:11.310
a moving target. They do, but it's a good target.
00:08:11.889 --> 00:08:14.550
Annual refreshes mean vendors and agencies align
00:08:14.550 --> 00:08:17.230
to a living standard. That reduces the risk that
00:08:17.230 --> 00:08:19.370
you buy a tool that looked compliant last year
00:08:19.370 --> 00:08:22.269
and fails you this year. Also, attorney general
00:08:22.269 --> 00:08:24.569
enforcement means the state can bring pain if
00:08:24.569 --> 00:08:27.009
firms ignore the rules. But industry pushback
00:08:27.009 --> 00:08:30.129
is baked in. For sure. Companies will argue that
00:08:30.129 --> 00:08:32.570
safety frameworks are trade secrets, that incident
00:08:32.570 --> 00:08:35.350
reports will be misread, and that CalCompute
00:08:35.350 --> 00:08:38.750
is expensive. The counter is simple. Transparency
00:08:38.750 --> 00:08:41.190
and shared infrastructure reduce systemic risk
00:08:41.190 --> 00:08:44.690
and improve access. California often sets patterns
00:08:44.690 --> 00:08:47.269
others follow, so this is the template fight.
00:08:47.950 --> 00:08:50.370
On Llama and open models, the procurement muscle
00:08:50.370 --> 00:08:53.509
memory matters. It does. Open source isn't free
00:08:53.509 --> 00:08:56.090
of obligations. You still need security reviews,
00:08:56.490 --> 00:08:59.159
privacy gates, and support plans. Ask vendors
00:08:59.159 --> 00:09:01.259
who wrap Llama how they patch vulnerabilities,
00:09:01.840 --> 00:09:04.200
how they monitor misuse, and how you can exit
00:09:04.200 --> 00:09:07.500
without losing your data and prompts. Open lowers
00:09:07.500 --> 00:09:10.320
cost and speeds pilots, but the basics still
00:09:10.320 --> 00:09:14.460
apply. On CMS, the on-the-ground story is delay
00:09:14.460 --> 00:09:17.679
and appeals. Right. Even with human review promises,
00:09:18.399 --> 00:09:21.100
seniors and providers can see more hoops. The
00:09:21.100 --> 00:09:23.460
target services, skin and tissue substitutes,
00:09:23.779 --> 00:09:26.620
nerve stimulator implants, and knee arthroscopy
00:09:26.620 --> 00:09:29.580
are areas where prior authorization already bites.
00:09:30.340 --> 00:09:33.639
AI can make initial screening faster, but a faster
00:09:33.639 --> 00:09:36.379
deny-and-appeal cycle is still a burden unless
00:09:36.379 --> 00:09:44.450
the review standard is clear and fair. And statehouses
00:09:44.450 --> 00:09:48.230
will hear a lot about innovation and jobs. Campaigns
00:09:48.230 --> 00:09:50.509
and advocacy shops should come with concrete
00:09:50.509 --> 00:09:53.470
harms. Deepfake election content. Deceptive
00:09:53.470 --> 00:09:56.870
ads. Biased decisions in benefits. And data leaks.
00:09:57.429 --> 00:10:00.429
Pair harms with solutions. Disclosures, audit
00:10:00.429 --> 00:10:03.009
rights, whistleblower shields, and real penalties.
00:10:03.509 --> 00:10:06.190
That's how you avoid being painted as anti-tech
00:10:06.190 --> 00:10:13.919
while pushing for safety. It does, and it could
00:10:13.919 --> 00:10:17.059
help small orgs. Public compute can give universities,
00:10:17.399 --> 00:10:19.720
watchdogs, and smaller agencies access to the
00:10:19.720 --> 00:10:22.620
horsepower they can't afford alone. That's a
00:10:22.620 --> 00:10:25.399
public goods case, as long as privacy and access
00:10:25.399 --> 00:10:28.559
rules are tight and the funding is stable. Love
00:10:28.559 --> 00:10:31.460
it. Well, that's another episode in the bag.
00:10:31.740 --> 00:10:35.200
This is The Impact, brought to you by MF Strategies.
00:10:35.440 --> 00:10:39.679
Connect with the team at www.mfstrategies.com.
00:10:40.240 --> 00:10:43.110
I'm Addie. See you next week on The
00:10:43.110 --> 00:10:43.509
Impact.