- Reducing RFI from the Tesla Powerwall 2
- Does the Tesla Powerwall 2 produce RFI (Radio Frequency Interference)?
This time, I do have a tale to tell about radio frequency interference (RFI) to my Powerwall 2 system.
The history:
Years ago, because it was convenient, I placed my DSL modem in my ham shack. Other than low-level spurs from the modem's plug-in power supply (wall wart) - which were easily solved - it never caused any problems, nor did my HF operation seem to bother the DSL modem, except on 160 meters when I ran more than about 50 watts. I found this remarkable because the wire from the DSLAM (the distant interface from the phone company) came through the window only about a foot away from the windowed transmission line that carried the transmit RF power to the antenna. At some point I dropped the POTS (Plain Old Telephone Service) dial-up in favor of VOIP and, at the network interface (the box outside the house), I disconnected the internal house wiring as it was no longer needed, back-feeding this internal wiring from the VOIP box to allow the continued use of the phones.
Several years ago I got a linear amplifier capable of full legal power (1500 watts in the U.S.) to use on the HF bands when conditions were poor, and on some frequencies this high-power operation would cause intermittent drop-outs in Internet connectivity when I transmitted. It seemed that the longer I operated, the less frequent these drop-outs became, possibly due to the modem "re-training" itself to deal with the (apparently) degraded connections.
When I got the Powerwall, its Ethernet connection came through the outside wall and into the house near the DSL modem as that was a convenient place to make the connection - and a UPS was nearby. I recently decided to relocate the DSL modem to another room - one farther from the ham shack and closer to where the underground wire from the telephone company comes into the house. This involved a bit of additional Ethernet cabling, and since my entire house is effectively on a UPS (via the Powerwall) it no longer mattered where I plugged the modem in.
Without the DSL modem in the shack, I installed another Ethernet switch to manage the multiple connections made at that point: one to the shack computer, another to the garage's Ethernet (to provide Internet connectivity to the solar inverters), another to a KiwiSDR, and yet another to a second switch where even more things were connected. Also on this switch is the Ethernet connection to the Powerwall.
On the blink:
I had relocated the networking gear in the afternoon about a week and a half ago, and it wasn't until that evening that I got around to tidying things up slightly and putting a transmitter on the air - in this case, my 630 meter (472-479 kHz) station - to make a few contacts. When I keyed the transmitter - which produces about 75 watts of RF - the lights in the house dimmed, a UPS beeped and the lights went bright again - so I quickly un-keyed.
After swearing to myself and hoping that this was a coincidence, I waited a minute or two and tried again - with the same result: the power flickered, the UPS beeped and the lights went back to normal. Bringing up the Tesla app, I looked at the Powerwall's back-up history and saw that I now had two outages, each less than 30 seconds. That in itself was unusual because the Powerwall typically stays off the grid for a couple of minutes even after utility power returns, to make sure that it is stable.
"$#!+", I thought to myself again!
I then powered up the HF station. Things were OK on 75 meters at both 100 and 1500 watts, and also OK on 40 meters at 100 watts - but on 40 meters I triggered the same "blink" response at any power level above about 600 watts.
Now began the methodical investigation. The first thing that I did was to disconnect the Ethernet connection to the Powerwall from the switch that I had just installed: No problems at all on any frequency or power level.
This was getting interesting: Why would connecting the Ethernet cause a problem?
Ethernet connections are supposed to have galvanic isolation via a transformer! To be sure, there is a small amount of capacitive coupling between the two windings, but this was on the order of a few 10s of picofarads - not nearly enough to cause a problem at 630 meters - or so one would think!
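To put a rough number on that intuition - this is just a back-of-the-envelope sketch, and the 30 pF figure is an assumed value rather than a measurement of the actual transformer - the capacitive reactance of a few 10s of picofarads is still in the kilo-ohm range at these frequencies:

```python
import math

def capacitive_reactance_ohms(c_farads, f_hz):
    # Xc = 1 / (2 * pi * f * C)
    return 1.0 / (2.0 * math.pi * f_hz * c_farads)

C_COUPLING = 30e-12  # assumed ~30 pF of inter-winding capacitance (illustrative, not measured)

for band, f_hz in (("630 meters (475 kHz)", 475e3), ("40 meters (7 MHz)", 7e6)):
    xc = capacitive_reactance_ohms(C_COUPLING, f_hz)
    print(f"{band}: Xc is roughly {xc:,.0f} ohms")

# Roughly 11,000 ohms at 475 kHz and 760 ohms at 7 MHz - a fairly high-impedance
# path at 630 meters, which is why a few 10s of picofarads alone would not be
# expected to couple much RF onto the cable.
```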
I then grabbed another Ethernet switch - a small, inexpensive Linksys 5-port switch - and connected everything to it: No problems.
At this point I might have been able to get away with using that small, cheap switch, but it was very old and "only" a 100 Mbps-capable switch, which meant that it would be a bottleneck for traffic between computers. I was also determined to make it so that it would not matter where I connected the Powerwall's Ethernet cable, as I wished to avoid future problems should I forget why it was there!
When shielded cable doesn't help:
A bit of investigation revealed that the CAT-5e cable to the Powerwall was shielded. One might first think that this should have solved the problem - but that would be wrong:
Shielding is useful for containing energy within the cable and preventing it from radiating, but if the cable in question is, itself, longitudinally conducting RF energy from one place to another, this shielding has absolutely no useful effect on its own! In other words, if a piece of equipment at one end is somehow allowing RF to be induced onto the Ethernet cable, shielding will do nothing at all to prevent the conduction of RF to the far end - and it may well make it worse!
Still, the shield might prove useful for shunting some of the RF current to ground, so out of due diligence I went outside to the electrical raceway below the Powerwall, into which the conduit - all of it metallic and bonded to everything else, including the Powerwall - conveys the Ethernet cable from the house. There, I carefully opened the outer jacket of the cables and made a connection to the shields, which I then bonded to the metal raceway as depicted in Figure 2.
It didn't.
No surprise there.
I decided to get serious about the problem and started checking the ferrite devices that I'd previously installed. Winding a few turns of wire through each, I measured the inductance and found that they did provide a reasonable amount of reactance at 40 meters - typically 5-10 microHenries, which provides between 200 and 400 ohms of impedance at 7 MHz - but clearly this was not enough, as I was still having problems at 7 MHz. It would be even less adequate at 630 meters, where the reactance would be only a few 10s of ohms - hardly enough to significantly impede RF energy at that frequency. I then began to rummage through my collection of ferrites, running a few turns of wire through each.
It became clear that the devices I had previously used were typically of "Mix 43" or similar, best used for the higher HF frequencies (and into VHF): a few turns through one of these will give 10-ish microHenries of inductance, but I needed far more than this, so I switched my attention to finding "Mix 75" and "Mix 77" devices - ferrite materials with an order of magnitude or so more permeability that would also yield much higher impedance: a few turns on these would give hundreds of microHenries and offer several kilo-ohms of reactance to better block RF - especially at 630 and 2200 meters.
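To put approximate numbers on the difference between the mixes - the ~10 microHenry figure matches the measurements above, while the 300 microHenry figure for a Mix 75/77 core is an assumed, illustrative value - the simple reactance formula shows why the higher-permeability material matters so much at 630 and 2200 meters:

```python
import math

def inductive_reactance_ohms(l_henries, f_hz):
    # Xl = 2 * pi * f * L (ignores core loss, which only adds to the blocking impedance)
    return 2.0 * math.pi * f_hz * l_henries

# ~10 uH matches the "Mix 43" chokes measured above; 300 uH is an assumed,
# illustrative value for a few turns on a higher-permeability Mix 75/77 core.
chokes = (("Mix 43-style choke, ~10 uH", 10e-6),
          ("Mix 75/77-style choke, ~300 uH (assumed)", 300e-6))
bands = (("40 meters (7 MHz)", 7e6),
         ("630 meters (475 kHz)", 475e3),
         ("2200 meters (137 kHz)", 137e3))

for choke_name, l_h in chokes:
    for band_name, f_hz in bands:
        x = inductive_reactance_ohms(l_h, f_hz)
        print(f"{choke_name} at {band_name}: ~{x:,.0f} ohms")

# The ~10 uH choke offers only a few 10s of ohms at 630 meters and less at
# 2200 meters, while the higher-permeability choke reaches hundreds of ohms
# or more at those same frequencies.
```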
What finally worked:
What I settled on was a combination of several things:
- Several turns of the Ethernet cable through some Mix 75 snap-on chokes. The use of several chokes and several turns maximizes the added inductance (see Figures 2 and 3).
- Some more of the same snap-on chokes were placed over the RS-485 connection to the Neurio in the garage. There wasn't evidence that RF energy was affecting this line, but I didn't want to take the chance.
- As depicted in Figure 4, I wound a 6-foot length of flat CAT-6 Ethernet cable onto a toroidal (ring) core of Mix 75 ferrite to provide a choking inductance of several milliHenries - several orders of magnitude more inductance than the other ferrite devices. This "bulk" inductance is responsible for blocking RF energy on the 630 and 2200 meter bands, on which I often operate.
- Inside the raceway I placed some additional Mix 75 snap-on devices over both the Ethernet and RS-485 cables, each passing through the cores several times for maximum inductance.
By adding all of this ferrite I increased the impedance (at RF) presented to common-mode current to hundreds (if not thousands) of ohms at all of the frequencies on which I am likely to operate. This is the sort of effort to which one must go to minimize conducted RF energy on such conductors.
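For a sense of scale on that combination - taking, say, 3 milliHenries as a stand-in for the "several milliHenries" of the wound toroid (an assumed value for illustration) - the bulk choke alone is already well into the kilo-ohm range on the low bands, and the snap-on chokes along the same cable add their impedances in series on top of that:

```python
import math

BULK_CHOKE_H = 3e-3  # assumed 3 mH for the flat CAT-6 wound on the Mix 75 toroid

for band_name, f_hz in (("2200 meters (137 kHz)", 137e3), ("630 meters (475 kHz)", 475e3)):
    x_ohms = 2.0 * math.pi * f_hz * BULK_CHOKE_H
    print(f"{band_name}: ~{x_ohms / 1000:.1f} kilo-ohms from the bulk choke alone")

# Works out to roughly 2.6 kilo-ohms at 137 kHz and 9 kilo-ohms at 475 kHz;
# the series combination with the snap-on chokes is what pushes the total
# common-mode impedance into the hundreds-to-thousands of ohms everywhere.
```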
Why did it matter which Ethernet switch I was using? I'm not sure, but I suspect that the Gig-E switch lacks complete galvanic isolation on at least some of its Ethernet ports. This issue could occur not only if there is a shielded Ethernet cable with shielded connectors attached at each end - which would be a liability in this particular case - but also if there is some sort of balanced connection to the pairs themselves, such as would be present if the switch had POE (Power Over Ethernet) capability. The manual for the switch indicates no POE capability, but there is definitely something different about it and the way it connects to the cables! Perhaps a similar model of this switch does have POE and there is, in fact, some connection inside - but unless/until I tear it down to find out, I won't really know.
Afterward:
Several days after this event I got a telephone call from Tesla Powerwall support asking me to remove the sticker from my back-up gateway and see if the "pin" was visible. When queried, the representative noted that they had recorded several "incidents" with my system - and the times and dates of these correlated exactly with my RF interference issues. I told her that, as far as I was concerned, the issue was resolved - but they still insisted that I take a picture of the pin and forward it to them - see Figure 5.
Update: I was contacted again by Tesla - and this time they asked if I was able to move the pin. Not having been asked to do this before, I did so when I was at home again and it did move: to the right, the house was isolated from the grid and to the left, it was tied to the grid.
Having forwarded this information, I have yet to hear back.
Parts sources for ferrite devices:
There are several sources for the snap-on ferrite devices described on this page, including:
- KF7P Metalwerx - link - Supplier of a variety of ferrite devices and many other things. At the time of posting he stocks the "Mix 31" devices but not the "Mix 75" snap-on cores, although he does have ferrite rings in both mixes.
- Mouser Electronics - link - "Mix 31" snap-on cores: Mouser P/N 623-0444164181 (Fair-Rite P/N 0444164181); "Mix 75" snap-on cores: Mouser P/N 623-0475164181 (Fair-Rite P/N 0475164181). Mouser Electronics carries other sizes and mixes of these devices, including toroids (rings).
Links to other articles about power supply noise reduction found at ka7oei.blogspot.com:
- Containing RF noise from a "pure" sine wave UPS. Even when it is not operating, your sine wave UPS may be producing a lot of HF radio interference!
- Completely containing switching power supply RFI - link. Sometimes it can be difficult to quiet a switching power supply, so it may be necessary to put it in a box with strong filtering on all of the conductors that enter/leave.
- Minimizing VHF (and HF) RFI from electronic ballasts and fluorescent tubes - link. Electronic light ballasts, like many switching power supplies, operate in the LF frequency range so "cleaning them up" at VLF/LF/MF frequencies can be a challenge.
- Quieting high current switching power supplies used in the shack - link. This page describes techniques that can be used to reduce the amount of RF energy produced by switching power supplies that you may be using to power your radios. Again, higher-inductance chokes may be required at VLF/LF/MF frequencies.
- Reducing switching supply racket - link. This describes techniques that can be used to beef up the filtering for switching supplies in general.
- Teasing out the differences between the "AC" and "DC" Powerwalls - link. In this post I discuss generally how "AC Battery" systems like the Powerwall work, as well as the general differences between the so-called "DC" Powerwall and the AC Powerwall.
- The solar saga part 1: Avoiding Interference (Why I did not choose Microinverters) - link. Having had first-hand experience observing a microinverter-based PV system, I discuss why I went the route of the series-string inverter.
- The solar saga part 2: Getting the system online - link. As with most large projects, something intervened to make it take longer to complete - and that was the case here - but it was successfully completed... eventually!
This page stolen from blogspot.ka7oei.com
[End]