Tokenomics Proposal: Replacing the Bonding Curve

On the website linked below, you can take a look at the share buybacks that Allianz has carried out over the last few years. As I am sure you know, Allianz is a giant in the insurance business, and as you surely also know, a share buyback means buying the company’s own shares on the stock market with its own capital pool. Since Allianz is trading above book value, as opposed to Nexus, any buyback reduces its book value per share. The page includes details of the currently running program and of all past buybacks, which in total amount to more than $10B since 2017.

https://www.allianz.com/en/investor_relations/share/share-buy-back.html

This fact completely invalidates your premise that, in the insurance business, buybacks only happen below book value. It also undermines your other comment that in the insurance business book value is the correct measure of success, because if that were the case, there would be no explanation for one of the biggest insurance companies in the world using a tool that reduces book value.

2 Likes

I also said:

I don’t find convincing the argument that because X large insurer did this, it is automatically a good approach that we should replicate. Allianz trades at a small premium to its book value, as do some other large insurers, particularly those with valuable consumer-facing auto/home/health policies and direct access to consumers.

They also hold a large amount of real estate and have a large asset management business. So, it’s not obvious to me that you can say that Nexus Mutual is in a similar position to Allianz, and that we should just copy what they are doing.

Clearly, they believe that their business is undervalued, or they are just treating this as an efficient way to distribute excess capital beyond their typical dividend.

I’m fairly confident that you’ve made comments before asking whether the mutual should dissolve, which would mean releasing capital at book value. You can’t have it both ways: you can’t say both that the mutual should dissolve and release capital, and that we believe our true value is greater than book value and so should buy shares above book.

By arguing that we should be releasing capital, in any form at all, you’re arguing that the mutual cannot use the capital more efficiently in the business and so should instead return it to investors. If that’s the case, given how small our outstanding exposure is, how can you simultaneously believe that true value is above book value? What insurer with less than $200m in exposure is worth more than book value when it hasn’t printed massive YoY growth?

It’s pretty obvious to me personally that the fair true value of Nexus Mutual right now is not significantly above book value. If and when we begin to grow our exposure and increase profitability, that true value should rise: by accruing profits that drive book value up, and by earning a premium to book because we’ve shown strong growth that justifies paying above it. At that point, if you can predict strong profitable growth going forward, you can make an argument that we should programmatically buy tokens above current book value. But we are not there today.

To be clear, because I’m anticipating that this will be spun: I’m not suggesting that the value of Nexus Mutual today is ONLY the assets we hold. I’m suggesting that it’s not significantly above that, to the point where we should permanently introduce a tokenomics system that will blindly buy back tokens far above book value, when there are countless other factors that should dictate whether that is a good idea or not.

Once your system is in place, we would need to believe that the mutual should only ever be valued as a function of outstanding cover, which seems obviously untrue. Your argument is that book value alone cannot be used to decide at what price to buy tokens, and yet your system uses only cover outstanding to decide…

Imagine a scenario where the market is aware that enormous claims are upcoming, but they have not yet been filed. The true value of the protocol should then likely be below book value, since in this example there’s a belief that 20-30% of the assets will shortly be paid out in claims, after which revenue will also drop, because with fewer assets our other policies must shrink. Clearly, using only cover outstanding to programmatically buy back tokens from the market here is horrible. Not only do we take a big loss on the claims, we double down by then programmatically buying tokens from the market at an excessive price. A rational operator would instead halt any buybacks at that point, reassess based on the new asset value and cover sales post-claims, and decide what the new buyback price should be.

Rei’s system has a similar flaw, but with the benefit of a ratcheting mechanism that goes down. When the market incorporates this information into wNXM, sells will come directly to Nexus Mutual, and the price ratchets down to reflect the sell pressure, so that we aren’t paying so much in excess of the new perceived “true value”.
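
The ratcheting-down behaviour described above can be illustrated with a toy model. This is purely a sketch with made-up parameters (the starting quote, `ratchet_rate`, and period count are all hypothetical), not the actual specification of Rei's design:

```python
# Toy downward ratchet: while net selling persists, the protocol's quoted
# buyback price decays toward book value each period, so the mutual stops
# paying a stale "true value" once the market has repriced.
# All parameters here are hypothetical, for illustration only.

def ratchet_down(quote: float, book_value: float,
                 ratchet_rate: float, periods_of_selling: int) -> float:
    """Decay the quoted buyback price toward book value, never below it."""
    for _ in range(periods_of_selling):
        quote = max(book_value, quote * (1 - ratchet_rate))
    return quote

# Quote starts 30% above a book value of 1.00; 0.5% decay per period.
quote = ratchet_down(1.30, 1.00, 0.005, 20)
assert 1.00 <= quote < 1.30  # bounded below by BV, decayed from the stale quote
```

The key property is that sustained sell pressure lowers the price the mutual pays, rather than the mutual absorbing the full sell flow at the original, stale quote.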

I should also note that I didn’t say every insurer trades at or below book. I said that book value drives the market cap, which is broadly true. Depending on the niche, you might find that property insurers are closer to 2-2.5x P/B, while life insurers very often trade below 0.8x P/B. That’s a function of profitability, growing market segments and less efficient policy pricing.

Keep in mind, you’re also only looking at the largest, most profitable insurers in the industry. Nexus Mutual was unprofitable in 2022 and has 1/10,000 of the assets of Allianz. They are not even remotely comparable.

When you look at more comparable insurers, P/B is a better measure.

If you’re looking at one of the most profitable insurers, with diversified risks, an extremely strong brand, a real estate and asset management business, and unique access to customers, then no, P/B is not a measure you can rely on very strongly when deciding whether to do a buyback.

Your suggested system is completely blind to the factors that would dictate whether an insurer should be valued at a premium to book value, yet it automates the issuance of new tokens and the buyback of existing ones. Rei’s system, which automates this but only at around book value, prevents the system from taking massively suboptimal actions.

If we wish to do larger buybacks, above book value, we can do that manually via vote if and when we choose.

In our current position, we can likely execute fairly significant buybacks on a purely one-off basis, by buying wNXM on the market. There is no need to bake this directly into the tokenomics, which will exist for years going forward. We are on the same page about doing a buyback right now. Yes, we should do that, at the right price, which is the current wNXM market price. I disagree that this needs to be programmatically forced into the tokenomics, because under many scenarios it becomes net negative.

I’m anticipating that your response to this will spin off into whether the mutual should dissolve or not. If it does, I’d kindly request that you do that in a separate thread. Your post here is about whether we should use your proposed tokenomics; let’s keep it focused on that only.

2 Likes

Thanks Dopeee.

A pros/cons list for my proposal as I see it:

Pros:

  • Unlocks capital pool for member exits
  • Pulls price up towards book value (“BV”) over time if sufficient liquidity is provided
  • Closes wNXM/NXM gap, and therefore makes NXM price used in protocol for covers/claims market-consistent
  • Book value increases as a result of each buy/sell by definition
  • The mutual can be a price-setter or price-taker depending on parameterisation, including having different positions above and below BV. For example, we can provide high liquidity for exits to set a strong price floor, but have low target liquidity in the pool above BV to closely follow the market.

Cons:

  • Redemptions from protocol disabled above BV and buying NXM from protocol disabled below BV, so capital only removed/captured through sells/buys in those ranges. Note that members can still enter/exit through wNXM and wrapping/unwrapping.
  • Price anchoring to book value from above, although this can be tempered by parameterisation.
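
The “book value increases as a result of each buy/sell by definition” point above can be checked with simple arithmetic. The figures below are made up; the point is the direction of the change in both cases:

```python
# If the protocol mints NXM at a price above book value per token, or buys
# back and burns NXM at a price below it, book value per token rises.
# Figures below are illustrative, not actual mutual balances.

def bv_per_token(assets_eth: float, supply: float) -> float:
    return assets_eth / supply

def mint(assets: float, supply: float, tokens: float, price: float):
    """Protocol sells `tokens` newly minted NXM at `price` ETH each."""
    return assets + tokens * price, supply + tokens

def burn(assets: float, supply: float, tokens: float, price: float):
    """Protocol buys back and burns `tokens` NXM at `price` ETH each."""
    return assets - tokens * price, supply - tokens

assets, supply = 140_000.0, 7_000_000.0        # BV/token = 0.02 ETH
a1, s1 = mint(assets, supply, 100_000, 0.022)  # mint above BV/token
a2, s2 = burn(assets, supply, 100_000, 0.018)  # burn below BV/token
assert bv_per_token(a1, s1) > 0.02  # minting above BV raises BV/token
assert bv_per_token(a2, s2) > 0.02  # burning below BV raises BV/token
```

In both directions, long-term holders end up with more asset backing per token, which is what makes the mechanism accretive regardless of market conditions.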

A comment on the timeline, as I’ve been getting lots of feedback on that: the aim is to have something live around mid-year, but it could be delayed by the amount of engineering effort required, gas cost optimisation, audit availability, etc.

Will provide thoughts with the appropriate amount of detail on the alternative proposal & discussion above after the weekend.

3 Likes

Well, many points to make:

  • First, an important point: you keep talking about “my” proposal, but it is not mine, it is Hugh Karp’s and Reinis Melbardis’. They wrote a white paper with it and built Nexus Mutual in its image. This is the design that every token holder who ever bought NXM on the Nexus platform bought into. Everybody bought at market price (market price, not book value) as defined by the bonding curve, and those who sold, sold to the mutual at market price (not book value) minus a spread. So it is important to keep in mind that I am not proposing any change; in fact, I am proposing to keep the design as it was defined, planned and implemented in the first place.
  • It is absurd to discuss with anybody who continuously moves the goal posts. First it was “buying above book value makes no sense, nobody would do that”, and I said that buybacks do exactly that. Then it was “ok, but only companies like Apple would do it, because they are valued differently, not insurers”, and I said that Allianz does. And now it is “ok, but Allianz is not an insurer that Nexus can compare itself with”. How much further away are the goal posts going to move?
  • By the way, Allianz also distributes dividends on top of performing stock buybacks, so it is not as if they do buybacks merely because they are more tax efficient. They simply decide to reward shareholders in both ways. REWARD shareholders, because buybacks are a way to reward the shareholders who stay, despite reducing book value, and despite using the capital pool to buy shares from members who are leaving.
  • I am also not asking that we copy what they do; I am suggesting that we do what we have always done.
  • The comment (singular, one, I have only made one comment about it) that I made about the mutual dissolving was that if a large majority of the mutual’s token holders want to remove their capital, the mutual should liquidate. And that is a common-sense statement: the capital belongs to the members of the mutual, and if they express a majority opinion that the mutual should stop operations, the mutual should do that. If there were a vote and 51% of the holders voted to close the mutual down, I certainly hope that nobody would oppose that. Note that this does not express my personal opinion on whether the mutual should dissolve.
  • I am definitely making the statement that the mutual cannot use the capital more efficiently in the business. This is based on the facts that a) active covers have rarely exceeded 200k ETH since August 2021, and currently stand at 116k ETH, which should be underwritten with something like ~42k ETH, leaving excess capital of ~110k ETH; and b) the investment activities of the mutual so far have been a net negative: more ETH has been lost than earned. Under these circumstances, the best use of the capital pool is returning it to the mutual’s members. How can I then say that “true value” is higher than book value? It obviously is not right now, but if book value were reduced and the capital used efficiently, capital returns would be positive, and “true” value (whatever “true value” means; I mean market price in the end) would be above book value. It is fine to think that the mutual could use capital efficiently; unfortunately, the reality is what it is. Facts are not opinions, and Nexus has never used the capital pool efficiently so far.
  • You said that Nexus right now is not worth significantly more than book value, and that “when we begin to grow our exposure and increase profitability, that true value should rise”. This is true, but exposure seems to be going down, and has been for some time. And do you know the other way to increase profitability? Reduce capital. So well done, you are on the right track there. And what is a way to reduce capital? Give it to shareholders.
  • “Once your system is in place”: “my” system is in place; Rei’s proposal would replace it. The rest of that sentence seems to imply that the current market price is based on book value, but it isn’t. Again, facts are not opinions. The wNXM price is based on what the market values it at, and that is not book value, it is 50% of it. The NXM price is based on what the bonding curve values the mutual at. Book value is a parameter the bonding curve uses to calculate token price, but so is MCRfloor, and the result of the bonding curve with MCRfloor is not book value, it is ~40% more than that. The bonding curve would value NXM higher than now, with a smaller capital pool, if only MCR were calculated accurately and not kept artificially high.
  • You keep focusing on the mutual buying tokens above book value, apparently still convinced that this is bad by default, despite the points that buybacks are positive for shareholders and that capital needs to be reduced to improve profitability. I have to wonder if you know how the mutual used to work: buying tokens and selling tokens were always open, and both happened at market prices. In fact, redeeming NXM tokens was (and is) subject to a 2.5% redemption fee. This fee encourages users to wrap their NXM and sell wNXM outside the Nexus platform, which means the capital pool does not leave the mutual, while the member is still able to leave at market price, above book value. That is a win-win-win situation. In the end there would be an equilibrium in which some NXM redemptions happen, but token sales, at market price rather than book value, would replace the capital leaving the pool, and wNXM and NXM would trade at more or less the same level. This was the case until the price of NXM was kept artificially high and wNXM decoupled from it.
  • My best guess is that you only see the situation in which somebody buys wNXM now, unwraps, and redeems NXM for ETH. Since that means buying below BV and selling above BV, there is some profit for whoever does it, and you assume the mutual is on the losing end of that trade. But that misses the relation between cause and effect: wNXM has become so cheap because Nexus is so unprofitable and the capital pool is unused and locked. Getting rid of the excess capital will make the mutual more profitable, which in turn will make it more valuable. The profit this person makes is not at the expense of the mutual; the one losing here is whoever bought NXM on the platform, got desperate, wrapped it, and sold wNXM below book value. The mutual and its members will profit from this, despite the smaller size of the capital pool.
  • I would have a few other things to say, but I am going to stop here; I’ve written enough. My bottom line is that I want to keep the existing system in place, removing the artificial limits, so that the mutual can operate as defined in the white paper. The bonding curve explained there is adequate for the mutual and fits its mission of redeeming excess capital when it is abundant and attracting capital when it is needed, all by increasing or reducing the token price that the mutual offers. This would increase the token price even if book value per token decreases, it would allow anybody who wants to leave to do so at market prices, and the mutual would be smaller but far more capital efficient and more profitable. All in all, a clear net positive for the mutual, for the holders who want to leave, and for the ones who want to stay.
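
For reference, the white-paper bonding curve sets the token price as A + (MCR / C) × MCR%^4, with MCR% = capital pool / MCR. A minimal sketch (using the published constants A = 0.01028 ETH and C = 5,800,000; the pool and MCR figures are hypothetical, and the live implementation has additional details) shows why the MCR value dominates the quoted price:

```python
# White-paper bonding curve price: P = A + (MCR / C) * (MCR%)**4,
# where MCR% = capital pool / MCR. Constants as published in the white
# paper; pool and MCR figures below are hypothetical.

A = 0.01028    # ETH, constant from the white paper
C = 5_800_000  # constant from the white paper

def token_price(capital_pool_eth: float, mcr_eth: float) -> float:
    mcr_pct = capital_pool_eth / mcr_eth
    return A + (mcr_eth / C) * mcr_pct ** 4

pool = 150_000.0
p_floored = token_price(pool, 150_000.0)  # MCR floor keeps MCR% near 100%
p_cover = token_price(pool, 60_000.0)     # hypothetical cover-driven MCR
assert p_cover > p_floored  # lower MCR -> higher MCR% -> much higher price
```

Because MCR% enters at the fourth power, a lower cover-driven MCR pushes the quoted price up steeply with the same capital pool, which is the mechanism behind the “the bonding curve would value NXM higher with a smaller capital pool” argument above.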
2 Likes

Thanks Rei.

A pros/cons list for the alternative proposal as I see it:

Pros:

  • Unlocks capital pool for member exits
  • Pulls price up towards book value (“BV”) and beyond
  • Closes wNXM/NXM gap, and therefore makes NXM price used in protocol for covers/claims market-consistent
  • The price of NXM/wNXM would trade relatively freely around a value that is determined by the mutual, based on its own financing needs and capital resources
  • The redemption fee would encourage members to exit by wrapping the token and selling it outside of the platform. Furthermore the redemption fee would be an additional income source for the mutual.
  • This proposal should be very easy to implement, and would keep most of the existing structure of the mutual in place.
  • The mutual would become much leaner, more capital efficient and more profitable

Cons:

  • Book value decreases
  • The size of the capital pool will depend on how many people want to redeem their tokens, and the level at which it stabilizes is uncertain. In any case, it would remain a fraction of the current amount.

Honestly I cannot think of any more cons, but if anybody else can provide any reasonable ones, I would add them here.

1 Like

Not sure why you insist on being so consistently rude.

I’ve put a lot of time and effort into trying to have a good discussion here and come to the right decision.

I’m not actively trying to move the goal posts. Originally we were talking about today, right now, for Nexus, where I said we should only buy back below book. Then you brought up Allianz, where I’m saying they might be able to justify buying slightly above book. Then I brought up Apple, where book value plays very little part in their valuation, so they can justify a buyback far above it. Those are all different situations with different answers.

I’m happy to have conversations with those with different opinions. It’s good for the mutual to consider all the options. But, I have no desire to continue to be insulted. I’m happy to continue discussing if you change your mind on talking with me, and can be polite.

3 Likes

Please take the time to be kind and respectful to the other members you are speaking with. Everyone is just trying to achieve consensus and debate ideas.

If you are frustrated with another member, you are free to not engage with them or to simply not comment.

This discussion, like all discussions within the mutual, is about the free and open exchange of ideas. It’s not appropriate to insult other members.

Throughout this conversation, you have been aggressive and have repeatedly insulted @Dopeee. I’m giving you a warning, as your last comment was over the line. I’ve included it below.

This violated the “Treat everyone with respect” guideline and the zero-tolerance policy on harassment that we have for our community forums.

Instead of banning you here, I’m issuing a warning and asking you to limit your comments to the discussion of ideas and to refrain from demeaning comments or ad hominem attacks on other members.

Going forward, I’ll have no choice but to ban you if you cannot abide by these community guidelines.

2 Likes

This is also about hyping executive compensation via share buybacks… irrelevant to this discussion. Tokens are NOT equity, and the incentives are NOT the same as in share markets… just let this line of thinking die…

3 Likes

Maybe it is time to start a snapshot vote between the proposal by the tokenomics group and the one keeping the bonding curve, in order to focus on one of the two solutions? This discussion has been open for more than 14 days already.

1 Like

Indeed, it would be good to have an indication of community consensus and snapshot vote is a logical next step.

For completeness, I believe we should have the following 4 options:

  • Do nothing, everything is fine
  • Tokenomics Group proposal
  • Bonding Curve+ proposal
  • Back to drawing board

The natural deadline to have something concluded is for when engineers start work on this after v2 is fully rolled out, so suggest we aim to go to Snapshot next Wednesday.

Can then collectively deal with edge cases, parameterisation, etc., during the next stage of implementation.

Will coordinate with @BraveNewDeFi to get the snapshot proposal up and message through the various channels. @ReeseWickham will message you separately to discuss how you want to present the proposal keeping the Bonding Curve for the snapshot vote.

4 Likes

Hi Rei and Nexus community,
In order to present my proposal in time for the snapshot vote next week, I have created a document that describes it in detail. It starts with some background: why the mutual ended up restricting redemptions, why the bonding curve was not the problem and why we should keep it, and a bit more colour on how we could do exactly that in an easy yet configurable way. You can find the document at the following link:

(link updated since the old one was inaccessible for whatever reason)

1 Like

@ReeseWickham

Just want to say that I was hesitant about some of your suggestions and pushback against the tokenomics v2 outline, but I am very pleased with this. Much more measured and nuanced.

While I’d still lean towards the initial v2 proposal, I’d be on board with this as well if it’s what governance ultimately decides on (although that would obviously hinge on the parameters chosen).

Given that this should be quicker and easier to implement than the strawman tokenomics, and that resolving the tokenomics issues should be a top priority, could it even be possible to implement this as a tokenomics v1.5 while tokenomics v2 is being built out?

Assuming for a moment that governance and the mutual’s members decide that @Rei’s v2 approach is the preferable end-state, but that it would take 9-12 months to implement: we could implement this approach in the interim, as outlined in your proposal, with the TP = Max(Book Value, TP(MCR)) parameter.

That would alleviate some concerns about a Race-to-Exit dynamic (@Hugh’s Issue #3 in Discord), while having Rei’s v2 implemented once ready would mean that the concerns around Relocking and Above-Book Price Dampening (@Hugh’s Issues #1 and #2) are also addressed.

1 Like

Hey @ReeseWickham,

Can you send me a 50-100 word TL;DR of your proposal that I can include in the Snapshot signaling vote text?

I’ll be putting the signaling vote on Snapshot on Wednesday, so if you could get your summary to me by then, I would appreciate it :slightly_smiling_face:

You can share here on the forum or DM me on Discord.

2 Likes

There you go, exactly 100 words :slight_smile:

“Regarding the NXM tokenomics re-design discussions, this proposal supports the idea that the optimal solution for Nexus Mutual’s tokenomics is the existing model, based on a bonding curve, as described in the white paper. The reason why redemptions have been disabled and other undesirable effects happened, can be traced back to a single parameter defining the minimum capital requirements. Removing this parameter would not impact mutual’s solvency or operations, it would solve redemption issues and it would make the mutual much more capital efficient and profitable. This proposal also explains how this change is technically feasible and easy to implement.”

2 Likes

this is an outstanding summary of the situation

3 Likes

Some thoughts on the proposals from me, ahead of the snapshot vote.

I’d prefer to avoid spending development resource on implementing an intermediate v1.5 solution before moving to a long-term one. While there are likely to be fewer explicit changes to the code, many parts of the timeline will not change, e.g. audit slots.

I’ll refer to the proposal I put forward in my original post as “Ratcheting AMM” and keeping the (modified) bonding curve simply as “Bonding Curve”.

Generally, my take is that they can both achieve some of the desired outcomes:

  • Remove the MCR floor
  • Unlock capital for members to exit
  • Allow arbitrage with wNXM price
  • Conduct an automated wNXM buyback below book value in a controlled fashion to start with (assuming we are below book value upon launch)

Where the differences are, and why I prefer the Ratcheting AMM approach in the long run:

Flexible liquidity & easier adjustments

The Ratcheting AMM is more flexible and simpler in setting parameters on how quickly we follow market price movements (or nudge them, if we want to) via liquidity provided to the pool.

On the Bonding Curve, the amount of liquidity that buyers/sellers have to ‘get through’ to reach a certain price on the protocol depends on additional parameters, the Capital Pool and MCR, which vary with the different states of the mutual and do not, I believe, always align with our collective goals when it comes to attracting and releasing capital.

Asset backing

The Ratcheting AMM by definition increases the asset-backing-per-token (“book value”) for long-term-aligned members, regardless of the number of buys/sells and across different market conditions, cycles, etc. It also then moves the price towards that value. The Bonding Curve can at times be dilutive of that value; this may be especially true in the early stages of implementation if we are still below book value and the mutual mints a lot of new NXM while the price-driving MCR element is set high and the price is set low.

Note that under the Ratcheting AMM, people can still buy at low prices and sell at high prices through wNXM.

Bonding Curve Issues

In the long run, having a bonding curve (after the suggested transition period) that always sets a price above the book value when MCR%>100% encourages more users to exit directly from the mutual, especially during bear markets, and I believe is more likely to result in capital locks based on the cover-driven MCR limit.

In addition, the bonding curve is not particularly good at capturing capital without intervention due to the exponential MCR% element, as was demonstrated in summer 2020 and resulted in the mutual raising the MCR floor value to the current level.

Happy voting everyone!

2 Likes

Thank you, @Rei, for putting this proposal together.

As stated in Discord, we are on board with tokenomics modifications in line with the prioritization presented there:

  1. Fix MCR first
  2. Identify/develop mechanisms for encouraging capital
  3. Buying/selling NXM

Of the presented solutions in the proposal, we prefer option A. However, we would like to emphasize that there needs to be clarity on:

  1. The anticipated budget and rollout plan. This is a new mechanism and it’s fairly complex. Ideally, we start with a small initial budget and increase over time rather than funding immediately with e.g. 10k ETH and potentially overlooking something. We’re not sure how this could be done in practice, but it warrants exploration to avoid a potentially expensive lesson.
  2. A contingency plan for the possibility that anticipated liquidity needs are exceeded. We can envision scenarios in which sell-pressure might still exceed the budget even when we allocate a large amount to it as mentioned above. Some possible causes for this sell pressure might include 1) the surge in price itself, 2) the anticipated decrease of book value (e.g. related to claims), and 3) the increase in risk if the active cover is not decreased in parallel (i.e. if the supply is decreased by -x%, the expected loss per NXM will increase by x/(1-x) %). It is not clear if we should increase the budget in those cases or stop the ratcheting altogether. Ideally, we understand how the risk profile (e.g. slashing insurance) and supply-profile (e.g. longer term staking) evolves in v2 as it likely directly impacts the potential sell-pressure. Overall, it is crucial to address these potential scenarios to safeguard both the protocol and long-term investors.
  3. A mechanism to avoid frontrunning/gamification. Frontrunning could be an inherent problem of the ratcheting system (incl. withdrawals on anticipated claims mentioned in 2). Mechanisms such as adding a fee and/or adjusting slippage should ideally be developed to reduce these kinds of activities.
  4. A replacement for the current MCR. Lifting the current MCR floor is reasonable, but it would be good to know the plans/timelines for potential adjustments to the MCR estimation in place at that point.
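
The supply-concentration arithmetic in point 2 can be checked directly. With the total expected loss held fixed, burning a fraction x of the supply raises the expected loss per token by x/(1-x):

```python
# Point 2 above: if supply shrinks by a fraction x while total expected
# loss is unchanged, loss per NXM scales by 1/(1-x), an increase of x/(1-x).

def loss_per_token_increase(x: float) -> float:
    """Fractional increase in expected loss per token after burning x of supply."""
    return 1 / (1 - x) - 1   # algebraically equal to x / (1 - x)

assert abs(loss_per_token_increase(0.20) - 0.25) < 1e-12  # -20% supply -> +25%
assert abs(loss_per_token_increase(0.50) - 1.00) < 1e-12  # -50% supply -> +100%
```

Note the asymmetry: the per-token risk increase grows faster than the supply reduction, which is why buybacks without a parallel reduction in active cover concentrate risk on remaining holders.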

We understand that 1 and 2 above require additional data following the launch of v2, and that the current signaling vote is solely to help provide clarity on direction. However, we were hoping to have some additional clarity around these data points before voting and/or at least want to check that all of the points are part of parameters that we can decide on once this has been put in code. If any of the above points have been addressed elsewhere, we would love to take a look.

3 Likes

Hey @Justin-1kx,

Just wanted to note that the Snapshot signaling vote is only to gauge sentiment on where the community would like to see development resources allocated post-V2 launch.

No changes can be made to the protocol without an all-member on-chain vote. In the Technical specifications section of the Snapshot proposal, I’ve highlighted what comes after this Snapshot vote:

The next stage is dependent on the outcome of this signaling vote. If Option A or Option B receive the majority vote, then the next stages will involve:

Technical development. The Engineering team would develop the smart contracts necessary to implement the chosen proposal. This would include development, testing, and audits of the design.

Determine parameters of the chosen design. Members will discuss and select the initial parameters for the chosen design and approve the implementation through an on-chain vote for the final protocol improvement proposal.

Members will need to discuss #1 after the Snapshot signaling vote closes, as well as the other points that you’ve noted. @Rei did include an estimate on timing in the original post above:

Below is a timeline from my perspective going forward, provided there is agreement on the mechanism. As always, way more important to be thorough and ensure code & mechanism is rock-solid rather than shipping something with holes in it.

Q1 Feedback on mechanism and parameter discussion. If no major red flags and nothing significant needs to be redesigned, engineering work begins to convert tokenomics spec to solidity code after v2 is live.

Q2 Code finalised, appropriate testing, audits.

Q3/Q4 Governance process, New mechanism live.

I’m looking forward to members discussing the parameters going forward once there’s clarity on which direction members signal support for.

2 Likes

Thanks for the comments. Discussing the below for the Ratcheting AMM design only. Generally, I believe points 1.-3. can indeed be addressed through parameterisation and smaller tweaks to the system.

1. Anticipated Budget & rollout - agree that getting this right is crucial and should be a big part of the launch process design. Ideally we’d run a small-scale experiment to see what happens as suggested.
I have to note that the work so far suggests that setting the right amount of initial/ongoing liquidity is key to achieving the desired outcomes, so it might not be as simple as going in with e.g. 10% of what we actually want in order to test the waters, because the outcome would be entirely different. However, there is plenty to play around with here to give ourselves the best chance of a good outcome.

2. Contingency Plan - Yep, we need to plan for both higher and lower sell pressure, but I think the real concern here comes back to ensuring that entry and exit work well when the system is operating at capital efficiency, i.e. capital can and wants to come in, and can leave reasonably, while liabilities remain appropriately provided for.

In the long term, we should be operating stably and profitably, with the ability to grow, while holding a capital pool close to 100% of cover-driven MCR.

Whether that comes sooner or later depends largely on how quickly cover grows, and in what profile. The best outcome is that cover increases 5x tomorrow and we run on cover-driven MCR, with Capital Pool = MCR at or near the current level. The other outcome is that covers don’t grow, but eventually we will still be running at Capital Pool = MCR.

To address the possible causes you mentioned:

  1. A price surge above book value implies all sell pressure is then routed through wNXM, so the protocol doesn’t lose any capital; we only lose capital below book value. Liquidity provision is the key parameter determining how quickly that can happen.
  2. Expected large claims will indeed lower the book value, but frontrunning them in the short term can be mitigated by slippage, and in the long term book value should go up as a result of cover fees > claims.
  3. Unless I’m misunderstanding the point, this is around the Capital Pool = MCR consideration I mentioned above.

Talking through and getting comfortable with these scenarios and more will indeed be a key part of parameter setting and implementation.

3. Frontrunning. Agree - one of the aims is minimising scenarios where someone can turn a profit without contributing anything to the system. One of the benefits of the design is that frontrunning can be mitigated by slippage, which is then reversed by the ratchet. Compare this to the current system, where sequences such as sell high → MCR% reduced by claims → buy back at lower price are much more predictable and easily gamed (although in practice there hasn’t been much history of this happening ahead of claims).
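For intuition only, here is a toy model of the slippage-plus-ratchet dynamic described above. The linear price impact, the book value figure, and the ratchet speed are all invented placeholders, not the actual specification:

```python
# Hypothetical sketch: a seller front-running an expected claim pays
# slippage immediately, and the spot price then ratchets back toward
# book value over time. All numbers and the linear impact/ratchet
# functions are assumptions, not the Nexus Mutual spec.

BOOK_VALUE = 0.10          # ETH per NXM (made up)
RATCHET_PER_DAY = 0.005    # ETH per NXM per day, pulled toward book value

def price_after_sale(spot: float, sale_eth: float, liquidity_eth: float) -> float:
    """Spot price after a sale, with impact proportional to sale size
    relative to pool liquidity (simplified linear impact)."""
    return spot * (1 - sale_eth / liquidity_eth)

def ratchet(spot: float, days: float) -> float:
    """Move the spot price back toward book value at a fixed daily rate."""
    if spot < BOOK_VALUE:
        return min(BOOK_VALUE, spot + RATCHET_PER_DAY * days)
    return max(BOOK_VALUE, spot - RATCHET_PER_DAY * days)

spot = BOOK_VALUE
spot = price_after_sale(spot, sale_eth=200, liquidity_eth=1_000)  # 20% impact
print(f"after front-run sale: {spot:.4f}")
spot = ratchet(spot, days=2)
print(f"two days later:       {spot:.4f}")
```

The point of the sketch: the would-be frontrunner eats the 20% slippage immediately, while the discount they created decays back toward book value for everyone else, so the trade is far less attractive than under a predictable bonding-curve repricing.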

Once the direction is decided, look forward to the community coming together to put forward any unaddressed scenarios of concern so that they can be incorporated during development.

4. MCR. As per the Discord post you referenced, there is no intention to change the current on-chain implementation of Cover Amount * X.
Mainly this is due to technical reasons - posting actuarial calculations of the type

Capital Requirement (CR) = √( Σᵢ,ⱼ Corr(i,j) · Exp(i) · Exp(j) )

where

Exp(i) - losses in extreme events, and
Corr(i,j) - correlations,

is prohibitive due to computational costs, and Cover Amount * X is a proxy that moves in the same direction as the exposure changes.
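For anyone wanting to see the two calculations side by side, here is an illustrative sketch with entirely made-up exposures and correlations, showing the off-chain CR formula and one way an X multiplier for the Cover Amount proxy could be derived from it:

```python
import math

# Off-chain actuarial calculation from the post:
#   CR = sqrt( sum over i,j of Corr(i,j) * Exp(i) * Exp(j) )
# Exposures and the correlation matrix below are illustrative figures only.

exp_losses = [100.0, 50.0, 30.0]   # Exp(i): losses in extreme events
corr = [                            # Corr(i,j): correlations
    [1.0, 0.3, 0.1],
    [0.3, 1.0, 0.2],
    [0.1, 0.2, 1.0],
]

def capital_requirement(exp: list[float], corr: list[list[float]]) -> float:
    total = sum(
        corr[i][j] * exp[i] * exp[j]
        for i in range(len(exp))
        for j in range(len(exp))
    )
    return math.sqrt(total)

cr = capital_requirement(exp_losses, corr)

# On-chain proxy: Cover Amount * X, where X would be tuned (via governance)
# so the cheap proxy tracks the expensive off-chain CR as exposure changes.
cover_amount = sum(exp_losses)
x = cr / cover_amount

print(f"full CR: {cr:.1f}")
print(f"proxy X: {x:.3f}")
```

Note that CR comes out below the raw sum of exposures because correlations under 1 give a diversification benefit; that gap is exactly what the X multiplier has to capture.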

However, one of my other projects within the DAO R&D team is to establish a more consistent process for updating these off-chain calculations and governing the X multiplier of the Cover Amount proxy that’s posted on-chain. Since most of this work will be done off-chain, it can run in parallel with the tokenomics implementation work. Exact timelines are TBD, but I’m expecting to start working on the actuarial side at the end of March and to have a process in place before the current DAO funding period ends at the beginning of August.

3 Likes

Following on from the snapshot vote concluding yesterday, wanted to briefly mention how I see the immediate next steps.

As per the Snapshot vote, and as mentioned by @BraveNewDefi above, to get to launch, we need to complete two parallel strands:

  • Technical development. The engineering team would develop the smart contracts necessary to implement the chosen proposal, including writing, testing, and auditing the code.
  • Determine parameters of the chosen design. Members will discuss and select the initial parameters for the chosen design and approve the implementation through an on-chain vote for the final protocol improvement proposal.

The next milestone will be to create a specification document that can be handed to the engineering team - hoping to do this within a month.

Some aspects of fine-tuning the chosen design will influence the design spec, e.g.

  • any transition mechanisms that require their own code,
  • any edge cases that may force a design change, and/or
  • any technical limitations discovered by the engineering team themselves

Other aspects will not, e.g. setting the numeric parameters of the system, like opening price, liquidity and ratchet speed.
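For concreteness, those purely numeric parameters might be grouped along these lines; the names and values are hypothetical placeholders rather than the spec, which is exactly why they can be finalised after the design is handed to engineering:

```python
from dataclasses import dataclass

# Hypothetical grouping of the numeric parameters mentioned above
# (opening price, liquidity, ratchet speed). Names and values are
# illustrative only; the real spec will define its own parameter set.

@dataclass(frozen=True)
class AmmParameters:
    opening_price_eth: float      # initial NXM price in ETH
    liquidity_eth: float          # ETH committed to the pool
    ratchet_speed_per_day: float  # fraction of the gap to book value closed daily

params = AmmParameters(
    opening_price_eth=0.021,
    liquidity_eth=2_500,
    ratchet_speed_per_day=0.04,
)
print(params)
```

Because these are plain numbers passed into the deployed contracts, members can debate and vote on them in parallel without blocking the engineering work.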

Therefore, my initial focus will be on nailing down those aspects that will influence the design, so that what’s handed over to the engineers is as robust as possible.

Some examples of the sorts of things I mean:

  • Currency of redemptions. Currently everything is modelled/calculated in ETH, but the capital pool is denominated in more currencies, with stETH chief among them, and may be diversified into further investment assets in the future. Should we enable withdrawals in multiple currencies according to the current split of the capital pool, or should we set aside a pool of ETH specifically for withdrawals?
  • NXM price transition. Detaching the price from the current bonding curve level to a market-consistent level would create a capacity shock. Should we trend it down over time and, if so, what’s the best approach?
  • Oracle safety: there is the possibility that book value as per the system is slightly out due to oracle errors/delays. Is this a concern in the range around book value and, if so, should we introduce some sort of buffer around it?
  • TWAP for system price: how it’s defined, what time frame, etc.
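On the TWAP question, here is a minimal sketch of one possible construction. The sampling cadence and window length are exactly the open parameters the bullet raises, so the values below are placeholders:

```python
from collections import deque

# Minimal time-weighted average price sketch. Window length and
# observation cadence are placeholders, not a proposed design.

class Twap:
    def __init__(self, window_seconds: int):
        self.window = window_seconds
        self.samples = deque()  # (timestamp, price) pairs, oldest first

    def observe(self, timestamp: int, price: float) -> None:
        """Record a price sample and drop samples outside the window."""
        self.samples.append((timestamp, price))
        while self.samples and self.samples[0][0] < timestamp - self.window:
            self.samples.popleft()

    def value(self) -> float:
        """Time-weighted average: each price holds until the next sample."""
        if len(self.samples) < 2:
            return self.samples[-1][1] if self.samples else float("nan")
        weighted = 0.0
        as_list = list(self.samples)
        for (t0, p0), (t1, _) in zip(as_list, as_list[1:]):
            weighted += p0 * (t1 - t0)
        span = as_list[-1][0] - as_list[0][0]
        return weighted / span

twap = Twap(window_seconds=3600)
twap.observe(0, 0.020)
twap.observe(1800, 0.022)   # price moves mid-window
twap.observe(3600, 0.022)
print(f"1h TWAP: {twap.value():.4f}")
```

A longer window makes the system price harder to manipulate with a single trade but slower to reflect genuine repricing, which is the trade-off parameter discussion would need to settle.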

I’m meeting some of the foundation team next week to brainstorm the next level of detail on the points above and more, and I intend to collate a list of suggestions and discussion points for the wider community and post them in a separate forum post, so that we can get a wide range of interested brainpower working on this.

In the meantime (ideally by end of Tuesday 24th Feb), I’d encourage everyone here to highlight anything that jumps out to you in the current design that gives you concern.

4 Likes