Dear Michael:
With this information alone, nobody can tell. There are just too many variables.
If you are speaking only of the losses in the feed arrangement of two otherwise identical half-wave dipoles:
a) for a center-fed dipole, it is mainly the feedline loss, which depends on the length and type of feedline (attenuation in dB/100 m from the datasheet), e.g. RG174 vs. RG58.
b) for the end-fed, it depends on the design of the matching circuit. A well-designed autotransformer can reach roughly 85% efficiency or better; poor designs or poor builds can fall below 25%.
Both will also have additional losses if there is an impedance mismatch.
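To put rough numbers on the two cases above, here is a short Python sketch. The coax attenuation figures in it are illustrative ballpark values only, not datasheet data; take the real numbers from your cable's datasheet at your operating frequency:

```python
import math

def coax_loss_db(attn_db_per_100m: float, length_m: float) -> float:
    """Matched-line loss of a feedline, given its datasheet attenuation."""
    return attn_db_per_100m * length_m / 100.0

def efficiency_to_loss_db(efficiency: float) -> float:
    """Convert a matching-network efficiency (0..1) into dB of loss."""
    return -10.0 * math.log10(efficiency)

# a) Center-fed: feedline loss for 10 m of cable.
#    Attenuation values are rough HF ballpark figures (assumptions).
print(coax_loss_db(11.0, 10.0))     # thin RG174-class cable: 1.1 dB
print(coax_loss_db(4.5, 10.0))      # RG58-class cable: 0.45 dB

# b) End-fed: loss of the matching transformer.
print(efficiency_to_loss_db(0.85))  # well-built: ~0.7 dB
print(efficiency_to_loss_db(0.25))  # poor build: ~6 dB
```

As the numbers suggest, a poor transformer costs far more than the difference between cable types over typical SOTA feedline lengths.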
If built properly and deployed in identical set-ups, an end-fed and a center-fed dipole will behave so similarly in practice that no receiving station will be able to tell them apart.
If you are talking about completely different antennas, it is simply impossible to make a valid statement, because there are too many variables.
The differences between most popular SOTA antenna designs are less than ca. 6 dB (typically much less, if we leave out flawed designs or very short ones). Even this is hardly relevant in S-levels, although 6 dB less (= one S-level) means a signal with only about 25% of the power (10^(-6/10) ≈ 0.25), i.e. roughly 75% weaker.
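The dB arithmetic above can be checked with a few lines of Python (a sketch; it assumes the usual HF convention of 6 dB per S-level):

```python
def db_to_power_ratio(db: float) -> float:
    """Convert a dB difference into a linear power ratio."""
    return 10.0 ** (db / 10.0)

# 6 dB down = one S-level on HF:
ratio = db_to_power_ratio(-6.0)
print(round(ratio, 3))        # ~0.251, i.e. about 25% of the power
print(round(1.0 - ratio, 2))  # ~0.75, i.e. roughly 75% weaker
```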
HB9SOTA did a substantial comparison of SOTA antennas in a field test back in 2017:
For most antennas tested, the difference was less than 0.5 S-levels.
That is all one can say at this level of abstraction. You can build and try - and this is what many hams consider an essential part of the fun. But since the outcome will depend on multiple variables (see above), any single comparison will also be of merely anecdotal value.