This story is based on the short story: "The Trial of Thomas Jefferson" (https://web.archive.org/web/20161227175444/http://www.davidbarrkirtley.com/thetrialofthomasjeffersonbydavidbarrkirtley.html)
***
One day, you discover a time machine that lets you set the rules for its usage. It does not allow history to be changed, but is set up to create alternate timelines where the evildoers of history may be punished. According to the time machine, you have the privilege of doing so as a result of being the first to walk upon it in a thousand years.
The time machine must be used to bring evildoers to justice. Your first thought is to define a wrong as something which is considered wrong by society today. Then you realize that there are perhaps a dozen particularly divisive issues on which society has not yet come to a consensus. You linger, perhaps for a bit too long, on the thought of setting the rules to reflect your personal stances on those dozen issues, and ultimately decide that it is quite unlikely that your personal stance on every one of those issues is correct.
Then you think of more abstract rules. Maybe you let the rules change as society changes, reflecting what the majority of society at the time agrees on. You fiddle with the proportion a bit.
One of your friends calls you. They believe that a much higher percentage of society must agree before an intervention into the past is to be made. They argue that there are times when the minority is indeed correct. They also argue that the time machine should be used sparingly, because ethics is complicated, and that such a powerful tool should therefore be restricted. They beg you to at least require a supermajority, if not unanimity.
Another one of your friends calls you. They believe that a much lower percentage of society must agree before an intervention into the past is to be made. They argue that people are very different, and that getting even 50% of the population to agree on anything is difficult, because there are more than two stances on any given issue, let alone on the combination of all of them. They argue that the time machine should be used as much as possible, because not to do so is to tacitly endorse the crimes of past humans through inaction. They beg you to lower the required proportion to at least the size of the smallest politically cohesive demographic, if not to individual people.
A third friend calls you. They believe that the whole idea of societal consensus is nonsense. They point out that the popularity of an idea is unrelated to its veracity. Some bad ideas are popular, some good ideas are unpopular, and letting the multiversal justice system rest on the fluctuations of popularity is a bad idea. They suggest instead punishing people only for failing to reach ethical conclusions that it would have been possible for them to reach on their own. Thus, if for some reason people in the future thought it was unethical to make origami, they would be unable to punish people in the past, because people in the past would have had no way of arriving at that conclusion by themselves. They point out that this hypothetical society which hates origami could punish people in the past under the suggestions of the second or even the first friend.
The first friend calls you again. They think the third friend's system is actually too lax, because it is possible for individual people to come to any number of wild conclusions on their own. They suggest as a compromise that people in the past must consent to being judged by future people on a specific ethical conclusion before people in the future are allowed to judge them. That way, the origami situation could only happen with the consent of the people living in the past.
The third friend calls again to reply. They think that the first friend's compromise still relies far too much on the whims of societal consensus. They demand that only people exposed to the arguments that led to the societal consensus in the first place may be judged by it. Otherwise, they argue, people might be punished for the crime of ignorance.
The first friend hesitantly agrees, but expresses a worry that people will isolate themselves in order to avoid hearing arguments for conclusions they do not agree with. They suggest that an effort be made to have everyone hear the arguments before a vote is taken. They decide that perhaps a smaller proportion of people should be enough to begin spreading the arguments around ahead of the actual deciding vote.
Just as the third friend is about to agree, the second friend calls you again. They realize that they are about to be outvoted, and double down on their demand to remove the restrictions on the use of time travel. They point out that there exists a fundamental trade-off between restricting malicious actors and empowering benevolent actors. They argue that, in general, people in the future are more moral than people in the past, and therefore that people in the past should try not to restrict the actions of people in the future. They point out that people abusing the time machine in the future can be punished by people even further in the future. They claim that no matter what system of restrictions the other two friends devise, it will never overcome that fundamental trade-off. They then reiterate their confidence that people in the future will be more moral.
The first friend casts doubt on whether people in the future are actually more moral. They point out that even if we in the present are more moral than people in the past, this does not immediately imply that people in the future will be more moral still. There are no grounds for extending the trend into the future.
The third friend then argues that, even if the second friend is right that people in the future are more moral, it is still a good idea to hedge your bets. Going all the way to one extreme of the trade-off is bound to have diminishing returns. It is not impossible to tell good people apart from bad people, so having no restrictions at all is definitely a bad idea. At the very least, it is still a good idea to have some basic restrictions.
The first friend suggests a compromise: a system with many restrictions initially, which can be lifted through sufficient consensus.
The second friend rejects the compromise, on the grounds that the compromise is still a restrictive system.
The third friend also rejects the compromise, on the grounds that the compromise will quickly turn into the second friend's system, because good people and bad people both believe that they are good.
A fourth friend calls you. They disagree, on principle, with the idea of retributive justice. They say that the creators of this time machine were pretty cruel in only allowing it to be used for punishment, and thus for suffering. They are concerned that the other three friends are far too focused on punishing the right people, and are not seeing the other ways they could be using the time machine for good. They suggest instead using the time machine to create alternate timelines of bountiful happiness, and thereby slightly increase the amount of pleasure in the multiverse. You politely point out to your fourth friend that the time machine was not designed for creating happiness, and throw them out of your thought experiment.
You hear the other three friends murmur in the background.
You propose to your friends that whether people in the future are actually more moral can be empirically tested. You point out that because time travel exists, it does not matter how long it takes to make the rules of the time machine, as long as they are made eventually. The people to be punished will still be there. You make it so that the specifics of the rules of time travel will be decided a thousand years from now. You also attach a general summary of the moral positions of current society. Before you submit your rules, you consult your friends one more time.
All three friends want the time to be extended past a thousand years. Each of them points out, in their own way, that taking your logic to its conclusion requires delaying as long as possible.
You, however, keep the time at a thousand years. It seems ... right, for some reason you can't quite articulate. Before you submit the rules, you decide to add a transcript of the conversation you had with your friends.
The time machine disappears.