Welcome to The Building Code Forum

Big Change to 2023 NEC - 10A Circuits?

jar546 (CBO), joined Oct 16, 2009
Please read and comment.

210.18 Rating
Branch circuits recognized by this article shall be rated in accordance with the maximum permitted ampere rating or setting of the overcurrent device. The rating for other than individual branch circuits shall be 10, 15, 20, 30, 40, and 50 amperes. Where conductors of higher ampacity are used for any reason, the ampere rating or setting of the specified overcurrent device shall determine the circuit rating.

Exception No. 1:
Multioutlet branch circuits greater than 50 amperes shall be permitted to supply nonlighting outlet loads in locations where conditions of maintenance and supervision ensure that only qualified persons service the equipment.
Exception No. 2:
Branch circuits rated 10 amperes shall not supply receptacle outlets.
 
It's kind of a nothingburger in the IRC world, since you still need 14 AWG wire and I don't think they make (or are likely to start making) 10-amp AFCI breakers. I have heard they are developing a standard to allow 16 AWG wire in subsequent code cycles, which could be of benefit, but as of now I don't think it's likely to be very useful.
 
Could this be a way to compensate for voltage drop on a normal 15-amp circuit?

Scenario: House built with normal 15-amp circuits and 14-gauge wire, but the back bedrooms are far enough away from the panel that there are voltage drop issues.

Solution: Instead of pulling new 12-gauge wire to accommodate the 15-amp circuits, just downsize the breaker to 10 amps. Problem solved?

Just spit-balling, go ahead and tell me why this is such a dumb idea. ;)
 
Lighting circuits is what I was thinking, not outlets.
 
Changing the breaker size will not affect voltage drop at all in relation to the operation of the lights. The voltage drop is a function of the conductivity of the wire over distance, so the only way to reduce voltage drop is to shorten the wire, upsize the wire, use a more conductive material for your wire, or up the voltage at the transformer (although that doesn't really change the voltage drop, it just compensates for it).

The only time voltage drop might affect a smaller breaker is that at a lower voltage you will pull more amps for the same load. If you load a circuit to precisely its full capacity, theoretically you might be able to extend the circuit far enough to reduce the voltage just enough to pull a little more amperage and get into the zone where the breaker might trip. If you are loading your circuits that close to full capacity, though, you should probably be using bigger wire anyway.
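To put the point above in numbers, here's a minimal Python sketch. The function and the example values are mine, not from the post; #14 Cu is taken as roughly 3.1 ohms/kft per NEC Chapter 9 Table 9. Note that the breaker rating never appears in the formula.

```python
# Round-trip resistive voltage drop on a 120 V 2-wire branch circuit.
# Only load current, length, and conductor resistance appear here;
# the breaker size is nowhere in the formula.

def voltage_drop(load_amps, one_way_ft, ohms_per_kft):
    """Voltage drop (volts) = I * R_round_trip for a 2-wire circuit."""
    r_round_trip = 2 * one_way_ft * ohms_per_kft / 1000.0
    return load_amps * r_round_trip

# 10 A on #14 Cu (~3.1 ohms/kft) at 150 ft one way:
print(round(voltage_drop(10, 150, 3.1), 2))  # 9.3 V, almost 8% of 120 V
```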
 
Right. If you do the calculations ahead of time and realize there will be a voltage-drop issue, then a good electrician would upsize the wire. For example, they may use 12 AWG copper on a 15-amp circuit. But if the house was already wired with 14 gauge, they could downsize the lighting circuit to a 10-amp breaker. Effectively, they would be upsizing the wire for a 10-amp circuit.

Again, I was just spit-balling a random example of why they listed a 10-amp circuit in the section the OP quoted.
 
Right. If you do the calculations ahead of time and realize there will be a voltage-drop issue, then a good electrician would upsize the wire. For example, they may use 12 AWG copper on a 15-amp circuit. But if the house was already wired with 14 gauge, they could downsize the lighting circuit to a 10-amp breaker.
If you have a branch circuit with almost 10A of load, and you find the voltage drop to be marginal (acceptable but just), then I guess it could make sense to change the breaker to 10A to suggest that no one should add more load to that circuit. But otherwise, voltage drop has nothing to do with breaker size. It just depends on the load, the circuit length, and the conductor size.

Cheers, Wayne
 
That is very narrow-minded and fundamentally wrong.
I don’t understand that. Since the breaker is at the supply end of the circuit, you don’t see any drop until you add some length of wire into the circuit and measure at the far end.
 
Hey y'all, I apologize for how that came off. It was not my intention to be a jerk.

The statement was that voltage drop has nothing to do with breaker size. What I was pointing out is that there is absolutely a connection between the two. We tend to see things top-down, like we look at a set of plans.

Voltage-drop is a function of load. If you have a 5-amp load on a 12-gauge copper wire, no problem, even if the distance is relatively far. As you increase the load the relative (or potential) voltage-drop becomes more of an issue.

The role of a breaker is to protect the wire from becoming overloaded. The size of the breaker is relative to the size of the wire under normal operating circumstances. There are scenarios where one might derate a breaker for functional purposes, such as adding a large PV load to a panel.

I saw this statement:
If you have a branch circuit with almost 10A of load, and you find the voltage drop to be marginal (acceptable but just), then I guess it could make sense to change the breaker to 10A to suggest that no one should add more load to that circuit. But otherwise, voltage drop has nothing to do with breaker size. It just depends on the load, the circuit length, and the conductor size.

Cheers, Wayne
My response was knee-jerk and not well thought out. For that, I apologize.
 
I never try to be a jerk either, but I am, so it comes out that way sometimes....lol
 
Voltage-drop is a function of load. If you have a 5-amp load on a 12-gauge copper wire, no problem, even if the distance is relatively far. As you increase the load the relative (or potential) voltage-drop becomes more of an issue.

The role of a breaker is to protect the wire from becoming overloaded. The size of the breaker is relative to the size of the wire under normal operating circumstances. There are scenarios where one might derate a breaker for functional purposes, such as adding a large PV load to a panel.
OK, the size of the wire determines the maximum OCPD that can be used to protect it, and hence a maximum for the calculated load that can be on the circuit. And if you have a voltage drop limit, then the size of the wire and the length determines another maximum actual load that can be on it. [Calculated load should be greater than or equal to actual load, as the calculations are intentionally conservative.]

So the connection is that they are both related to wire size. Still, seems like two separate limits; in sizing wires, you have to comply with both. If you have a voltage drop issue for a given loading, wire size, and length, changing the OCPD size isn't going to help. You would have to change one of the variables of allowable voltage drop, actual loading, wire size, or length.
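Those two independent limits can be sketched as code. These are hypothetical helper functions of my own, simplified to ignore derating and termination limits; the point is only that the breaker size shows up in one check and not the other.

```python
# Two separate checks when sizing a circuit: the OCPD/ampacity limit
# and the voltage-drop limit. Changing the breaker only affects the first.

def ocpd_ok(breaker_amps, wire_ampacity):
    """Simplified: the OCPD must not exceed the conductor's ampacity."""
    return breaker_amps <= wire_ampacity

def vdrop_ok(load_amps, one_way_ft, ohms_per_kft,
             supply_volts=120.0, limit=0.05):
    """Voltage drop within the design limit; no breaker term appears."""
    drop = load_amps * 2 * one_way_ft * ohms_per_kft / 1000.0
    return drop <= limit * supply_volts

# #12 Cu: 20 A ampacity, ~2 ohms/kft (NEC Ch. 9 Table 9)
print(ocpd_ok(20, 20))         # True: a 20 A breaker on #12 is fine
print(vdrop_ok(15, 400, 2.0))  # False: 15 A at 400 ft blows the 5% budget
# Swapping in a 10 A breaker changes neither voltage-drop result.
```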

Cheers, Wayne
 
Voltage-drop is a function of load. If you have a 5-amp load on a 12-gauge copper wire, no problem, even if the distance is relatively far.
Let's work out an example based on this, 5A of actual current draw. Per NEC Chapter 9 Table 9, #12 copper has an AC resistance of 2 ohms/kft, and I'll ignore the reactance. And let's say this is a 120V 2-wire branch circuit, with the actual voltage at the panel supplying the circuit exactly 120V under worst case loading of the rest of the wiring system.

If our load has only a 5% voltage drop tolerance, then we need less than 6 volts of voltage drop. At 5 amps, that means less than 1.2 ohms of round trip resistance, or 0.6 ohms of one-way resistance. At 2 ohms/kft, that's 300 ft. If we want to go any farther, we'll need to upsize the wire or change one of the design requirements. Still, we can protect the wire at 20A, as it is #12 Cu.
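A quick check of that arithmetic in Python, using the same assumed numbers:

```python
# Worked example above: #12 Cu at 2 ohms/kft, 5 A load, 120 V supply,
# 5% voltage-drop budget.
ohms_per_kft = 2.0
load_amps = 5.0
budget_volts = 0.05 * 120.0                      # 6 V allowed

max_round_trip_ohms = budget_volts / load_amps   # 1.2 ohms
max_one_way_ohms = max_round_trip_ohms / 2       # 0.6 ohms
max_one_way_ft = max_one_way_ohms / ohms_per_kft * 1000
print(max_one_way_ft)                            # 300.0 ft
```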

Alternatively suppose our load has a minimum voltage of say 80V. E.g. a gate opener for a distant gate, which is based on a 12V battery and a battery charger to maintain the battery. Now the allowable voltage drop is 40V, and at 5 amps, that's 8 ohms resistance round trip, or 4 ohms one way, or 2000 ft at 2 ohms/kft. If we want to go farther, we'd need to change one of the parameters. [Unless the gate is operating many many times a day, we probably don't really need a 400W charger to maintain the 12V battery, so the easiest change in this example would be to lower the load current to maybe 1A or 2A.] At 2000 ft, 1/3 of the energy delivered by the circuit would be lost as heat within the circuit (when operating at the full 5A), but as long as we are OK with that, and there's no energy code saying that's not allowed, #12 is still a functional choice.

Now this does raise an interesting question: at 2000 ft of #12, with an 8 ohm round trip resistance, the short circuit current at the end will be only 15A. Does the NEC allow us to protect the circuit at 20A as normal? If there's a fault at the end, it won't resolve; we'll just have a 2000 ft long 1800W heater indefinitely. Maybe that's an acceptable failure mode. If not, we'd better use a 10A OCPD, which should eventually trip at 150% of rated current. [Might not trip during the winter if the OCPD is outside; maybe a 6A fuse is better for that case.]
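And the gate-opener numbers, checked the same way (same assumed values):

```python
# Gate-opener example: 5 A load, 80 V minimum at the load, 120 V supply,
# #12 Cu at 2 ohms/kft.
supply, v_min, load_amps, ohms_per_kft = 120.0, 80.0, 5.0, 2.0

allowable_drop = supply - v_min                  # 40 V
round_trip_ohms = allowable_drop / load_amps     # 8 ohms
one_way_ft = round_trip_ohms / 2 / ohms_per_kft * 1000
print(one_way_ft)                                # 2000.0 ft

# Fraction of the input power burned in the wire at full load:
print(allowable_drop / supply)                   # 0.333..., i.e. 1/3

# Bolted-fault current at the far end is limited by the wire itself:
print(supply / round_trip_ohms)                  # 15.0 A, under a 20 A breaker
```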

So at this extreme distance, we do see a possible influence of voltage drop on the choice of OCPD size.

Cheers, Wayne
 
Is it so hard just to admit you were wrong?
You still haven't explained how there's a connection. Using a smaller OCPD in no way "compensates" for voltage drop (post #5).

Cheers, Wayne

P.S. Happy to admit I'm wrong when it is in evidence.
 
Basically, if you choose to limit the loading on a 20A branch circuit (20A OCPD, #12 conductors) to 15A because of voltage drop considerations, there is no requirement to also downsize the OCPD to 15A. Voltage drop limits and OCPD size limits are independent considerations.

Cheers, Wayne
 
Which is never, I presume. No thanks, I'm honestly not interested in explaining it to you.
A bit perplexed about this interaction. From my point of view, you made an assertion; I disagreed; you attacked me; then you repeated the assertion without any elaboration or support; I further disagreed, with some elaboration and explanation; and now you criticize me again and say you're not interested in explaining.

Anyway, perhaps your thinking is this: normally breaker size and conductor size go hand-in-hand, one determines the other (roughly true). When there is a voltage drop concern, this relationship is broken: you end up with a larger wire than usual for the given breaker size, which is equivalently a smaller breaker than usual for the given wire size. [Please explain if I'm mistaken as to what you mean.]

But this should be seen as upsizing the wire, not downsizing the breaker. The minimum breaker size is determined by NEC 215.3 / 210.20 and just depends on the load served, not the distance to the load. The wire requires a more complicated process to size (relatively simple for 1-3 CCCs and 30C ambient), and that sizing may involve a voltage drop check.

Cheers, Wayne
 