From 701591cf7de6057caa764b2e6a0afc2c7998adbc Mon Sep 17 00:00:00 2001 From: Mike MacCana Date: Mon, 30 Sep 2024 14:43:48 +1000 Subject: [PATCH] Remove smart quotes (they cause problems for some markdown parsers) --- .../verifiable-randomness-functions.md | 62 +++--- .../intro-to-solana/interact-with-wallets.md | 86 ++++---- .../intro-to-solana/intro-to-cryptography.md | 6 +- .../intro-to-custom-onchain-programs.md | 22 +- .../intro-to-solana/intro-to-reading-data.md | 6 +- .../intro-to-solana/intro-to-writing-data.md | 8 +- .../courses/mobile/intro-to-solana-mobile.md | 62 +++--- content/courses/mobile/mwa-deep-dive.md | 2 +- .../mobile/solana-mobile-dapps-with-expo.md | 2 +- .../cross-program-invocations.md | 28 +-- .../deserialize-custom-data-frontend.md | 44 ++-- .../deserialize-instruction-data.md | 10 +- ...paging-ordering-filtering-data-frontend.md | 76 +++---- .../program-security.md | 36 ++-- .../program-state-management.md | 2 +- .../serialize-instruction-data-frontend.md | 74 +++---- .../courses/onchain-development/anchor-cpi.md | 16 +- .../onchain-development/anchor-pdas.md | 34 +-- .../intro-to-anchor-frontend.md | 26 +-- .../onchain-development/intro-to-anchor.md | 26 +-- .../program-architecture.md | 86 ++++---- .../program-configuration.md | 4 +- .../program-optimization/rust-macros.md | 10 +- .../program-security/account-data-matching.md | 22 +- .../courses/program-security/arbitrary-cpi.md | 14 +- .../bump-seed-canonicalization.md | 10 +- .../program-security/closing-accounts.md | 12 +- .../duplicate-mutable-accounts.md | 8 +- .../courses/program-security/owner-checks.md | 2 +- .../courses/program-security/pda-sharing.md | 14 +- .../program-security/security-intro.md | 4 +- .../courses/program-security/signer-auth.md | 12 +- content/courses/solana-pay/solana-pay.md | 4 +- .../state-compression/compressed-nfts.md | 198 +++++++++--------- .../generalized-state-compression.md | 184 ++++++++-------- .../courses/token-extensions/close-mint.md | 2 +- 
.../token-extensions/default-account-state.md | 6 +- .../token-extensions/immutable-owner.md | 6 +- .../interest-bearing-token.md | 4 +- .../non-transferable-token.md | 2 +- .../token-extensions/permanent-delegate.md | 2 +- .../courses/token-extensions/required-memo.md | 4 +- .../courses/token-extensions/transfer-fee.md | 2 +- .../courses/tokens-and-nfts/token-program.md | 6 +- content/guides/advanced/stake-weighted-qos.md | 6 +- content/guides/games/hello-world.md | 14 +- content/guides/games/interact-with-tokens.md | 12 +- content/guides/games/store-sol-in-pda.md | 2 +- .../guides/getstarted/cosmwasm-to-solana.md | 4 +- content/guides/getstarted/intro-to-anchor.md | 2 +- .../getstarted/local-rust-hello-world.md | 4 +- content/guides/getstarted/rust-to-solana.md | 14 +- .../getstarted/scaffold-nextjs-anchor.md | 2 +- .../getstarted/solana-test-validator.md | 2 +- .../guides/javascript/get-program-accounts.md | 6 +- .../token-extensions/getting-started.md | 4 +- .../guides/token-extensions/transfer-hook.md | 4 +- .../add-solana-wallet-adapter-to-nextjs.md | 8 +- content/workshops/solana-101.md | 40 ++-- docs/advanced/confirmation.md | 82 ++++---- docs/advanced/retry.md | 28 +-- docs/economics/index.md | 2 +- docs/economics/inflation/terminology.md | 4 +- .../quick-start/cross-program-invocation.md | 2 +- docs/more/exchange.md | 4 +- docs/programs/testing.md | 12 +- 66 files changed, 752 insertions(+), 752 deletions(-) diff --git a/content/courses/connecting-to-offchain-data/verifiable-randomness-functions.md b/content/courses/connecting-to-offchain-data/verifiable-randomness-functions.md index 3f73823e1..3f4445c28 100644 --- a/content/courses/connecting-to-offchain-data/verifiable-randomness-functions.md +++ b/content/courses/connecting-to-offchain-data/verifiable-randomness-functions.md @@ -57,10 +57,10 @@ game as a seed. Unfortunately, neither type of randomness is natively available in Solana programs, because these programs have to be deterministic. 
All validators need -to come to the same conclusion. There is no way they’d all draw the same random -number, and if they used a seed, it’d be prone to attacks. See the +to come to the same conclusion. There is no way they'd all draw the same random +number, and if they used a seed, it'd be prone to attacks. See the [Solana FAQs](https://solana.com/docs/programs/lang-rust#depending-on-rand) for -more. So we’ll have to look outside of the blockchain for randomness with VRFs. +more. So we'll have to look outside of the blockchain for randomness with VRFs. ### What is Verifiable Randomness? @@ -68,7 +68,7 @@ A Verifiable Random Function (VRF) is a public-key pseudorandom function that provides proofs that its outputs were calculated correctly. This means we can use a cryptographic keypair to generate a random number with a proof, which can then be validated by anyone to ensure the value was calculated correctly without -the possibility of leaking the producer’s secret key. Once validated, the random +the possibility of leaking the producer's secret key. Once validated, the random value is stored onchain in an account. VRFs are a crucial component for achieving verifiable and unpredictable @@ -100,7 +100,7 @@ Switchboard is a decentralized Oracle network that offers VRFs on Solana. Oracles are services that provide external data to a blockchain, allowing them to interact with and respond to real-world events. The Switchboard network is made up of many different individual oracles run by third parties to provide -external data and service requests onchain. To learn more about Switchboard’s +external data and service requests onchain. To learn more about Switchboard's Oracle network, please refer to our [Oracle lesson](/content/courses/connecting-to-offchain-data/oracles.md) @@ -112,13 +112,13 @@ verified, the Switchboard program will execute a onchain callback defined by the VRF Account during account creation. From there the program can consume the random data. 
-You might be wondering how they get paid. In Switchboard’s VRF implementation, +You might be wondering how they get paid. In Switchboard's VRF implementation, you actually pay per request. ### Requesting and Consuming VRF Now that we know what a VRF is and how it fits into the Switchboard Oracle -network, let’s take a closer look at how to actually request and consume +network, let's take a closer look at how to actually request and consume randomness from a Solana program. At a high level, the process for requesting and consuming randomness from Switchboard looks like this: @@ -137,7 +137,7 @@ and consuming randomness from Switchboard looks like this: pseudorandom number returned from the Oracle. 7. Program consumes the random number and can execute business logic with it! -There are a lot of steps here, but don’t worry, we'll be going through each step +There are a lot of steps here, but don't worry, we'll be going through each step of the process in detail. First there are a couple of accounts that we will have to create ourselves to @@ -191,8 +191,8 @@ Some important fields on this account are `authority`, `oracle_queue`, and `callback`. The `authority` should be a PDA of the program that has the ability to request randomness on this `vrf` account. That way, only that program can provide the signature needed for the vrf request. The `oracle_queue` field -allows you to specify which specific oracle queue you’d like to service the vrf -requests made with this account. If you aren’t familiar with oracle queues on +allows you to specify which specific oracle queue you'd like to service the vrf +requests made with this account. If you aren't familiar with oracle queues on Switchboard, checkout the [Oracles lesson in the Connecting to Offchain Data course](/content/courses/connecting-to-offchain-data/oracles)! Lastly, the `callback` field is where you define the callback instruction the @@ -254,7 +254,7 @@ Now, you can create the `vrf` account. 
Now that we have all of our needed accounts we can finally call the `request_randomness` instruction on the Switchboard program. It's important to note you can invoke the `request_randomness` in a client or within a program -with a cross program invocation (CPI). Let’s take a look at what accounts are +with a cross program invocation (CPI). Let's take a look at what accounts are required for this request by checking out the Account struct definition in the actual [Switchboard program](https://github.com/switchboard-xyz/solana-sdk/blob/fbef37e4a78cbd8b8b6346fcb96af1e20204b861/rust/switchboard-solana/src/oracle_program/instructions/vrf_request_randomness.rs#L8). @@ -296,7 +296,7 @@ pub struct VrfRequestRandomness<'info> { } ``` -That’s a lot of accounts, let’s walk through each one and give them some +That's a lot of accounts, let's walk through each one and give them some context. - `authority` - PDA derived from our program @@ -320,7 +320,7 @@ context. [Recent Blockhashes Solana program](https://docs.rs/solana-program/latest/solana_program/sysvar/recent_blockhashes/index.html) - Token Program - Solana Token Program -That’s all the accounts needed for just the randomness request, now let's see +That's all the accounts needed for just the randomness request, now let's see what it looks like in a Solana program via CPI. To do this, we make use of the `VrfRequestRandomness` data struct from the [SwitchboardV2 rust crate.](https://github.com/switchboard-xyz/solana-sdk/blob/main/rust/switchboard-solana/src/oracle_program/instructions/vrf_request_randomness.rs) @@ -367,7 +367,7 @@ Ok(()) ``` Once the Switchboard program is invoked, it does some logic on its end and -assigns an oracle in the `vrf` account’s defined oracle queue to serve the +assigns an oracle in the `vrf` account's defined oracle queue to serve the randomness request. The assigned oracle then calculates a random value and sends it back to the Switchboard program. 
@@ -415,7 +415,7 @@ pub fn handler(ctx: Context) -> Result <()> { ``` Now you have randomness! Hooray! But there is one last thing we have not talked -about yet and that’s how the randomness is returned. Switchboard, gives you your +about yet and that's how the randomness is returned. Switchboard, gives you your randomness calling `[get_result()](https://github.com/switchboard-xyz/solana-sdk/blob/9dc3df8a5abe261e23d46d14f9e80a7032bb346c/rust/switchboard-solana/src/oracle_program/accounts/vrf.rs#L122)`. This method returns the `current_round.result` field of the `vrf` account @@ -440,11 +440,11 @@ the steps involved in a VRF request, review this diagram. ## Lab -For this lesson’s lab, we will be picking up where we left off in the +For this lesson's lab, we will be picking up where we left off in the [Oracle lesson](/content/courses/connecting-to-offchain-data/oracles). If you haven't completed the Oracle lesson and demo, we strongly recommend you do as -there are a lot of overlapping concepts and we’ll be starting from the Oracle -lesson’s codebase. +there are a lot of overlapping concepts and we'll be starting from the Oracle +lesson's codebase. If you don't want to complete the Oracle lesson, the starter code for this lab is provided for you in @@ -475,8 +475,8 @@ following: 6. `yarn install` 7. `anchor test` -When all tests pass we’re ready to begin. We will start by filling in some -boilerplate stuff, then we’ll implement the functions. +When all tests pass we're ready to begin. We will start by filling in some +boilerplate stuff, then we'll implement the functions. #### 2. Cargo.toml @@ -681,7 +681,7 @@ pub mod consume_randomness; Lastly, let's update our `deposit.rs` and `withdraw.rs` files to reflect our soon-to-be new powers. -First, let’s initialize our `out_of_jail` flag to `false` in `deposit.rs`. +First, let's initialize our `out_of_jail` flag to `false` in `deposit.rs`. 
```rust // in deposit.rs @@ -720,8 +720,8 @@ check, going straight to our withdrawal. #### 8. Using VRF -Now that we have the boilerplate out of the way, let’s move on to our first -addition: initializing our VRF Client. Let’s create a new file called +Now that we have the boilerplate out of the way, let's move on to our first +addition: initializing our VRF Client. Let's create a new file called `init_vrf_client.rs` in the `/instructions` folder. We'll add the needed crates, then create the `InitVrfClient` context. We'll need @@ -731,7 +731,7 @@ the following accounts: - `escrow_account` - the burry escrow account created when the user locked their funds up. - `vrf_client_state` - account we will be creating in this instruction to hold - state about the user’s dice rolls. + state about the user's dice rolls. - `vrf` - Our VRF owned by the Switchboard program, we will create this account client-side before we call `init_vrf_client`. - `system_program` - The system program since we use the init macro for @@ -786,7 +786,7 @@ only have one `escrow_account`. Since there is only one, If you wanted to be thorough, you might want to implement a `close_vrf_state` function to get your rent back. -Now, let’s write some basic initialization logic for this function. First we +Now, let's write some basic initialization logic for this function. First we load and initialize our `vrf_state` account by calling `load_init()`. Then we fill in the values for each field. @@ -926,7 +926,7 @@ pub struct RequestRandomness<'info> { } ``` -Lastly, we'll create a new struct `RequestRandomnessParams`. We’ll be passing in +Lastly, we'll create a new struct `RequestRandomnessParams`. We'll be passing in some account's bumps client-side. ```rust @@ -1006,7 +1006,7 @@ If doubles are rolled, set the `out_of_jail` field on `vrf_state` to true. First, let's create the `ConsumeRandomness` context. Fortunately, it only takes three accounts. 
-- `escrow_account` - state account for user’s escrowed funds. +- `escrow_account` - state account for user's escrowed funds. - `vrf_state` - state account to hold information about dice roll. - `vrf` - account with the random number that was just calculated by the Switchboard network. @@ -1092,7 +1092,7 @@ pub fn consume_randomness_handler(ctx: Context) -> Result <() } ``` -Now it’s time to actually use the random result. Since we only use two dice we +Now it's time to actually use the random result. Since we only use two dice we only need the first two bytes of the buffer. To convert these random values into “dice rolls”, we use modular arithmetic. For anyone not familiar with modular arithmetic, @@ -1201,12 +1201,12 @@ Please make sure your program builds successfully by running `anchor build`. #### 11. Testing -Alright, let’s test our program. Historically, we'd need to test the VRF on +Alright, let's test our program. Historically, we'd need to test the VRF on Devnet. Fortunately, the folks at Switchboard have created some really nice -functions to let us run our own VRF oracle locally. For this, we’ll need to set +functions to let us run our own VRF oracle locally. For this, we'll need to set up our local server, grab all of the right accounts, and then call our program. -The first thing we’ll do is pull in some more accounts in our `Anchor.toml` +The first thing we'll do is pull in some more accounts in our `Anchor.toml` file: ```toml diff --git a/content/courses/intro-to-solana/interact-with-wallets.md b/content/courses/intro-to-solana/interact-with-wallets.md index 924a5ea11..27738df9f 100644 --- a/content/courses/intro-to-solana/interact-with-wallets.md +++ b/content/courses/intro-to-solana/interact-with-wallets.md @@ -16,8 +16,8 @@ description: "Connect with installed browser wallets from your React apps." software wallets are often **browser extensions** that add the ability to connect to a wallet from a website. 
On mobile, wallet apps have their own browsers. -- Solana’s **Wallet Adapter** allows you to build websites that can request a - user’s wallet address and propose transactions for them to sign +- Solana's **Wallet Adapter** allows you to build websites that can request a + user's wallet address and propose transactions for them to sign ## Lesson @@ -26,7 +26,7 @@ description: "Connect with installed browser wallets from your React apps." In the previous two lessons, we discussed keypairs. Keypairs are used to locate accounts and sign transactions. While the public key of a keypair is perfectly safe to share, the secret key should always be kept in a secure location. If a -user’s secret key is exposed, then a malicious actor could execute transactions +user's secret key is exposed, then a malicious actor could execute transactions with the authority of that user, allowing them to transfer all the assets inside. @@ -42,7 +42,7 @@ existing device(s). Both techniques allow websites to interact easily with the wallet, for example: -1. Seeing the wallet’s wallet address (their public key) +1. Seeing the wallet's wallet address (their public key) 2. Submitting transactions for a user's approval to sign 3. Sending signed transactions to the network @@ -51,20 +51,20 @@ transaction to your wallet and having the wallet handle the signing, you ensure that you never expose your secret key to the website. Instead, you only share the secret key with the wallet application. -Unless you’re creating a wallet application yourself, your code should never +Unless you're creating a wallet application yourself, your code should never need to ask a user for their secret key. Instead, you can ask users to connect to your site using a reputable wallet. -## Solana’s Wallet Adapter +## Solana's Wallet Adapter If you build web apps, and need users to be able to connect to their wallets and -sign transactions through your apps, you'll want Solana’s Wallet Adapter. 
Wallet +sign transactions through your apps, you'll want Solana's Wallet Adapter. Wallet Adapter is a suite of modular packages: - The core functionality is found in `@solana/wallet-adapter-base`. - React support is added by `@solana/wallet-adapter-react`. - Additional packages provide components for common UI frameworks. In this - lesson, and throughout this course, we’ll be using components from + lesson, and throughout this course, we'll be using components from `@solana/wallet-adapter-react-ui`. Finally, some packages are adapters for specific wallet apps. These are now no @@ -73,7 +73,7 @@ longer necessary in most cases - see below. ### Install Wallet-Adapter Libraries for React When adding wallet support to an existing React app, you start by installing the -appropriate packages. You’ll need `@solana/wallet-adapter-base`, +appropriate packages. You'll need `@solana/wallet-adapter-base`, `@solana/wallet-adapter-react`. If you plan to use the provided React components, you'll also need to add `@solana/wallet-adapter-react-ui`. @@ -135,7 +135,7 @@ export const Home: NextPage = props => { ``` Note that `ConnectionProvider` requires an `endpoint` property and that -`WalletProvider` requires a `wallets` property. We’re continuing to use the +`WalletProvider` requires a `wallets` property. We're continuing to use the endpoint for the Devnet cluster, and since all major Solana wallet applications support the Wallet Standard, we don't need any wallet-specific adapters. At this point, you can connect with `wallet.connect()`, which will instruct the wallet @@ -144,9 +144,9 @@ for transactions. ![wallet connection prompt](/public/assets/courses/unboxed/wallet-connect-prompt.png) -While you could do this in a `useEffect` hook, you’ll usually want to provide +While you could do this in a `useEffect` hook, you'll usually want to provide more sophisticated functionality. 
For example, you may want users to be able to -choose from a list of supported wallet applications or disconnect after they’ve +choose from a list of supported wallet applications or disconnect after they've already connected. ### @solana/wallet-adapter-react-ui @@ -194,7 +194,7 @@ export default Home; ``` The `WalletModalProvider` adds functionality for presenting a modal screen for -users to select which wallet they’d like to use. The `WalletMultiButton` changes +users to select which wallet they'd like to use. The `WalletMultiButton` changes behavior to match the connection status: ![multi button select wallet option](/public/assets/courses/unboxed/multi-button-select-wallet.png) @@ -219,7 +219,7 @@ functionality: Once your site is connected to a wallet, `useConnection` will retrieve a `Connection` object and `useWallet` will get the `WalletContextState`. `WalletContextState` has a property `publicKey` that is `null` when not -connected to a wallet and has the public key of the user’s account when a wallet +connected to a wallet and has the public key of the user's account when a wallet is connected. With a public key and a connection, you can fetch account info and more. @@ -312,14 +312,14 @@ const sendSol = async event => { ``` When this function is called, the connected wallet will display the transaction -for the user’s approval. If approved, then the transaction will be sent. +for the user's approval. If approved, then the transaction will be sent. ![wallet transaction approval prompt](/public/assets/courses/unboxed/wallet-transaction-approval-prompt.png) ## Lab -Let’s take the Ping program from the last lesson and build a frontend that lets -users approve a transaction that pings the program. As a reminder, the program’s +Let's take the Ping program from the last lesson and build a frontend that lets +users approve a transaction that pings the program. 
As a reminder, the program's public key is `ChT1B39WKLS8qUrkLvFDXMhEJ4F1XZzwUNHUt4AU9aVa` and the public key for the data account is `Ah9K7dQ8EHaZqcAsgBW8w37yN2eAy3koFmUn4x3CJtod`. @@ -341,25 +341,25 @@ Then set your wallet to use Devnet, for example: - In Solflare, click **Settings** -> **General** -> **Network** -> **DevNet** - In Backpack, click **Preferences** -> **Developer Mode** -This ensures that your wallet app will be connected to the same network we’ll be +This ensures that your wallet app will be connected to the same network we'll be using in this lab. ### Download the starter code Download the [starter code for this project](https://github.com/Unboxed-Software/solana-ping-frontend/tree/starter). -This project is a simple Next.js application. It’s mostly empty except for the -`AppBar` component. We’ll build the rest throughout this lab. +This project is a simple Next.js application. It's mostly empty except for the +`AppBar` component. We'll build the rest throughout this lab. You can see its current state with the command `npm run dev` in the console. ### Wrap the app in context providers -To start, we’re going to create a new component to contain the various -Wallet-Adapter providers that we’ll be using. Create a new file inside the +To start, we're going to create a new component to contain the various +Wallet-Adapter providers that we'll be using. Create a new file inside the `components` folder called `WalletContextProvider.tsx`. -Let’s start with some of the boilerplate for a functional component: +Let's start with some of the boilerplate for a functional component: ```tsx import { FC, ReactNode } from "react"; @@ -373,7 +373,7 @@ const WalletContextProvider: FC<{ children: ReactNode }> = ({ children }) => { export default WalletContextProvider; ``` -To properly connect to the user’s wallet, we’ll need a `ConnectionProvider`, +To properly connect to the user's wallet, we'll need a `ConnectionProvider`, `WalletProvider`, and `WalletModalProvider`. 
Start by importing these components from `@solana/wallet-adapter-react` and `@solana/wallet-adapter-react-ui`. Then add them to the `WalletContextProvider` component. Note that @@ -405,9 +405,9 @@ export default WalletContextProvider; The last things we need are an actual endpoint for `ConnectionProvider` and the supported wallets for `WalletProvider`. -For the endpoint, we’ll use the same `clusterApiUrl` function from the -`@solana/web3.js` library that we’ve used before so you’ll need to import it. -For the array of wallets you’ll also need to import the +For the endpoint, we'll use the same `clusterApiUrl` function from the +`@solana/web3.js` library that we've used before so you'll need to import it. +For the array of wallets you'll also need to import the `@solana/wallet-adapter-wallets` library. After importing these libraries, create a constant `endpoint` that uses the @@ -450,10 +450,10 @@ export default WalletContextProvider; ### Add wallet multi-button -Next, let’s set up the Connect button. The current button is just a placeholder +Next, let's set up the Connect button. The current button is just a placeholder because rather than using a standard button or creating a custom component, -we’ll be using Wallet-Adapter’s “multi-button.” This button interfaces with the -providers we set up in `WalletContextProvider` and let’s users choose a wallet, +we'll be using Wallet-Adapter's “multi-button.” This button interfaces with the +providers we set up in `WalletContextProvider` and let's users choose a wallet, connect to a wallet, and disconnect from a wallet. If you ever need more custom functionality, you can create a custom component to handle this. @@ -492,7 +492,7 @@ export default Home; If you run the app, everything should still look the same since the current button on the top right is still just a placeholder. To remedy this, open `AppBar.tsx` and replace `` with ``. 
-You’ll need to import `WalletMultiButton` from +You'll need to import `WalletMultiButton` from `@solana/wallet-adapter-react-ui`. ```tsx @@ -519,17 +519,17 @@ button to connect your wallet to the site. ### Create button to ping program -Now that our app can connect to our wallet, let’s make the “Ping!” button +Now that our app can connect to our wallet, let's make the “Ping!” button actually do something. -Start by opening the `PingButton.tsx` file. We’re going to replace the +Start by opening the `PingButton.tsx` file. We're going to replace the `console.log` inside of `onClick` with code that will create a transaction and -submit it to the wallet app for the end user’s approval. +submit it to the wallet app for the end user's approval. -First, we need a connection, the wallet’s public key, and Wallet-Adapter’s +First, we need a connection, the wallet's public key, and Wallet-Adapter's `sendTransaction` function. To get this, we need to import `useConnection` and -`useWallet` from `@solana/wallet-adapter-react`. While we’re here, let’s also -import `@solana/web3.js` since we’ll need it to create our transaction. +`useWallet` from `@solana/wallet-adapter-react`. While we're here, let's also +import `@solana/web3.js` since we'll need it to create our transaction. ```tsx import { useConnection, useWallet } from "@solana/wallet-adapter-react"; @@ -588,7 +588,7 @@ export const PingButton: FC = () => { With that, we can fill in the body of `onClick`. First, check that both `connection` and `publicKey` exist (if either does not -then the user’s wallet isn’t connected yet). +then the user's wallet isn't connected yet). Next, construct two instances of `PublicKey`, one for the program ID `ChT1B39WKLS8qUrkLvFDXMhEJ4F1XZzwUNHUt4AU9aVa` and one for the data account @@ -633,17 +633,17 @@ const onClick = async () => { }; ``` -And that’s it! If you refresh the page, connect your wallet, and click the ping +And that's it! 
If you refresh the page, connect your wallet, and click the ping button, your wallet should present you with a popup to confirm the transaction. ### Add some polish -There’s a lot you could do to make the user experience here even better. For +There's a lot you could do to make the user experience here even better. For example, you could change the UI to only show you the Ping button when a wallet is connected and display some other prompt otherwise. You could link to the transaction on Solana Explorer after a user confirms a transaction so they can easily go look at the transaction details. The more you experiment with it, the -more comfortable you’ll get, so get creative! +more comfortable you'll get, so get creative! You can also download the [full source code from this lab](https://github.com/Unboxed-Software/solana-ping-frontend) @@ -651,7 +651,7 @@ to understand all of this in context. ## Challenge -Now it’s your turn to build something independently. Create an application that +Now it's your turn to build something independently. Create an application that lets a user connect their wallet and send SOL to another account. ![Send SOL App](/public/assets/courses/unboxed/solana-send-sol-app.png) @@ -659,7 +659,7 @@ lets a user connect their wallet and send SOL to another account. 1. You can build this from scratch or you can [download the starter code](https://github.com/Unboxed-Software/solana-send-sol-frontend/tree/starter). 2. Wrap the starter application in the appropriate context providers. -3. In the form component, set up the transaction and send it to the user’s +3. In the form component, set up the transaction and send it to the user's wallet for approval. 4. Get creative with the user experience. Add a link to let the user view the transaction on Solana Explorer or something else that seems cool to you! 
diff --git a/content/courses/intro-to-solana/intro-to-cryptography.md b/content/courses/intro-to-solana/intro-to-cryptography.md index 795850f41..6f1c3d83b 100644 --- a/content/courses/intro-to-solana/intro-to-cryptography.md +++ b/content/courses/intro-to-solana/intro-to-cryptography.md @@ -102,7 +102,7 @@ to install `@solana/web3.js` npm i @solana/web3.js ``` -We’ll cover a lot of +We'll cover a lot of [web3.js](https://solana.com/docs/clients/javascript-reference) gradually throughout this course, but you can also check out the [official web3.js documentation](https://solana.com/docs/clients/javascript-reference). @@ -135,7 +135,7 @@ store secret keys in source code. Instead, we: ### Loading an existing keypair -If you already have a keypair you’d like to use, you can load a `Keypair` from +If you already have a keypair you'd like to use, you can load a `Keypair` from an existing secret key stored in the filesystem or an `.env` file. In node.js, the `@solana-developers/helpers` npm package includes some extra functions: @@ -153,7 +153,7 @@ import { getKeypairFromEnvironment } from "@solana-developers/helpers"; const keypair = getKeypairFromEnvironment("SECRET_KEY"); ``` -You know how to make and load keypairs! Let’s practice what we’ve learned. +You know how to make and load keypairs! Let's practice what we've learned. ## Lab diff --git a/content/courses/intro-to-solana/intro-to-custom-onchain-programs.md b/content/courses/intro-to-solana/intro-to-custom-onchain-programs.md index c939d6c6b..c0b0b3400 100644 --- a/content/courses/intro-to-solana/intro-to-custom-onchain-programs.md +++ b/content/courses/intro-to-solana/intro-to-custom-onchain-programs.md @@ -27,7 +27,7 @@ In previous chapters, we used: `@metaplex-foundation/mpl-token-metadata@2` to make instructions to Metaplex to create token Metadata. 
-When working with other programs, however, you’ll need to create instructions +When working with other programs, however, you'll need to create instructions manually. With `@solana/web3.js`, you can create instructions with the `TransactionInstruction` constructor: @@ -47,7 +47,7 @@ const instruction = new TransactionInstruction({ `TransactionInstruction()` takes 3 fields: -- The `programId` field is fairly self-explanatory: it’s the public key (also +- The `programId` field is fairly self-explanatory: it's the public key (also called the 'address' or 'program ID') of the program. - `keys` is an array of accounts and how they will be used during the @@ -60,7 +60,7 @@ const instruction = new TransactionInstruction({ - `isWritable` - a boolean representing whether or not the account is written to during the transaction's execution -- an optional `Buffer` containing data to pass to the program. We’ll be ignoring +- an optional `Buffer` containing data to pass to the program. We'll be ignoring the `data` field for now, but we will revisit it in a future lesson. After making our instruction, we add it to a transaction, send the transaction @@ -99,7 +99,7 @@ for that signature in Solana Explorer, then see: ### Writing transactions for the ping counter program -We’re going to create a script to ping an onchain program that increments a +We're going to create a script to ping an onchain program that increments a counter each time it has been pinged. This program exists on the Solana Devnet at address `ChT1B39WKLS8qUrkLvFDXMhEJ4F1XZzwUNHUt4AU9aVa`. The program stores its data in a specific account at the address @@ -147,7 +147,7 @@ Now let's talk to the Ping program! To do this, we need to: Remember, the most challenging piece here is including the right information in the instructions. We know the address of the program that we are calling. We also know that the program writes data to a separate account whose address we -also have. 
Let’s add the string versions of both of those as constants at the +also have. Let's add the string versions of both of those as constants at the top of the file: ```typescript @@ -159,7 +159,7 @@ const PING_PROGRAM_DATA_ADDRESS = new web3.PublicKey( ); ``` -Now let’s create a new transaction, then initialize a `PublicKey` for the +Now let's create a new transaction, then initialize a `PublicKey` for the program account, and another for the data account. ```typescript @@ -168,7 +168,7 @@ const programId = new web3.PublicKey(PING_PROGRAM_ADDRESS); const pingProgramDataId = new web3.PublicKey(PING_PROGRAM_DATA_ADDRESS); ``` -Next, let’s create the instruction. Remember, the instruction needs to include +Next, let's create the instruction. Remember, the instruction needs to include the public key for the Ping program and it also needs to include an array with all the accounts that will be read from or written to. In this example program, only the data account referenced above is needed. @@ -190,9 +190,9 @@ const instruction = new web3.TransactionInstruction({ }); ``` -Next, let’s add this instruction to the transaction we created. Then, call +Next, let's add this instruction to the transaction we created. Then, call `sendAndConfirmTransaction()` by passing in the connection, transaction, and -payer. Finally, let’s log the result of that function call so we can look it up +payer. Finally, let's log the result of that function call so we can look it up on Solana Explorer. ```typescript @@ -251,10 +251,10 @@ console.log( ); ``` -And just like that you’re calling programs on the Solana network and writing +And just like that you're calling programs on the Solana network and writing data onchain! -In the next few lessons, you’ll learn how to +In the next few lessons, you'll learn how to 1. Send transactions safely from the browser instead of running a script 2. 
Add custom data to your instructions diff --git a/content/courses/intro-to-solana/intro-to-reading-data.md b/content/courses/intro-to-solana/intro-to-reading-data.md index 9ea60e35c..11c129b67 100644 --- a/content/courses/intro-to-solana/intro-to-reading-data.md +++ b/content/courses/intro-to-solana/intro-to-reading-data.md @@ -10,9 +10,9 @@ description: ## Summary -- **SOL** is the name of Solana’s native token. Each SOL is made from 1 billion +- **SOL** is the name of Solana's native token. Each SOL is made from 1 billion **Lamports**. -- **Accounts** store tokens, NFTs, programs, and data. For now, we’ll focus on +- **Accounts** store tokens, NFTs, programs, and data. For now, we'll focus on accounts that store SOL. - **Addresses** point to accounts on the Solana network. Anyone can read the data at a given address. Most addresses are also **public keys**. @@ -127,7 +127,7 @@ The balance of the account at CenYq6bDRB7p73EjsPEpiYN7uveyPUTdXkDkgUduboaN is 0. ## Lab -Let’s practice what we’ve learned, and check the balance at a particular +Let's practice what we've learned, and check the balance at a particular address. ### Load a keypair diff --git a/content/courses/intro-to-solana/intro-to-writing-data.md b/content/courses/intro-to-solana/intro-to-writing-data.md index 5e2999a12..deede360e 100644 --- a/content/courses/intro-to-solana/intro-to-writing-data.md +++ b/content/courses/intro-to-solana/intro-to-writing-data.md @@ -120,8 +120,8 @@ dropped with an error like: > Transaction simulation failed: Attempt to debit an account but found no record of a prior credit. ``` -If you get this error, it’s because your keypair is brand new and doesn’t have -any SOL to cover the transaction fees. Let’s fix this by adding the following +If you get this error, it's because your keypair is brand new and doesn't have +any SOL to cover the transaction fees. 
Let's fix this by adding the following lines just after we've set up the connection: ```typescript @@ -134,7 +134,7 @@ await airdropIfRequired( ``` This will deposit 1 SOL into your account which you can use for testing. This -won’t work on Mainnet where it would have value. But it's incredibly convenient +won't work on Mainnet where it would have value. But it's incredibly convenient for testing locally and on Devnet. You can also use the Solana CLI command `solana airdrop 1` to get free test SOL @@ -158,7 +158,7 @@ for that signature in the Solana Explorer, then see: ## Lab -We’re going to create a script to send SOL to other students. +We're going to create a script to send SOL to other students. ### Basic scaffolding diff --git a/content/courses/mobile/intro-to-solana-mobile.md b/content/courses/mobile/intro-to-solana-mobile.md index 956e139b1..cf6a9002b 100644 --- a/content/courses/mobile/intro-to-solana-mobile.md +++ b/content/courses/mobile/intro-to-solana-mobile.md @@ -54,7 +54,7 @@ you hold your own keys. **Mobile Gaming with Solana Micropayments** -Mobile games account for roughly 50% of the video game industry’s total value, +Mobile games account for roughly 50% of the video game industry's total value, largely due to small in-game purchases. However, payment processing fees usually mean these in-game purchases have a minimum of $0.99 USD. With Solana, it's possible to unlock true micropayments. Need an extra life? That'll be 0.0001 @@ -66,7 +66,7 @@ SMS can enable a new wave of mobile e-commerce shoppers to pay directly from their favorite Solana wallet. Imagine a world where you can use your Solana wallet as seamlessly as you can use Apple Pay. -To summarize, mobile crypto opens up many doors. Let’s dive in and learn how we +To summarize, mobile crypto opens up many doors. Let's dive in and learn how we can be part of it: #### How Solana development differs between native mobile apps and web @@ -106,7 +106,7 @@ is pushed to the background. 
This kills the MWA WebSocket connection. This is an inherent design difference between iOS and Android (probably made to preserve battery, network usage, etc). -However, this doesn’t mean that Solana dApps can’t run on iOS at all. You can +However, this doesn't mean that Solana dApps can't run on iOS at all. You can still create a Mobile Web App using the [standard wallet adapter](https://github.com/solana-labs/wallet-adapter) library. Your users can then install a mobile-friendly wallet like @@ -197,7 +197,7 @@ about soon. Reading data from a Solana cluster in React Native is the exact same as in React. You use the `useConnection` hook to grab the `Connection` object. Using -that, you can get account info. Since reading is free, we don’t need to actually +that, you can get account info. Since reading is free, we don't need to actually connect to the wallet. ```tsx @@ -279,7 +279,7 @@ const sendTransactions = (transaction: Transaction) => { #### Debugging Since two applications are involved in sending transactions, debugging can be -tricky. Specifically, you won’t be able to see the wallet's debug logs the way +tricky. Specifically, you won't be able to see the wallet's debug logs the way you can see your dApps logs. Fortunately, @@ -288,7 +288,7 @@ makes it possible to see logs from all applications on your device. If you prefer not to use Logcat, the other method you could try is to only use the wallet to sign transactions, and then send them in your code. This allows -you to better debug the transaction if you’re running into problems. +you to better debug the transaction if you're running into problems. #### Releasing @@ -300,8 +300,8 @@ First, most of the mobile app marketplaces have policies restricting blockchain involvement. Crypto is new enough that it's a regulatory wildcard. Platforms feel they're protecting users by being strict with blockchain-related apps. 
-Second, if you use crypto for "purchases" in-app, you’ll be seen as -circumnavigating the platform’s fee (anywhere from 15-30%). This is explicitly +Second, if you use crypto for "purchases" in-app, you'll be seen as +circumnavigating the platform's fee (anywhere from 15-30%). This is explicitly against app store policies as the platform is trying to protect its revenue stream. @@ -345,8 +345,8 @@ with React Native. The app will interact with the Anchor counter program that we made in the [Intro to client-side Anchor development](https://www.soldev.app/course/intro-to-anchor-frontend) lesson. This dApp simply displays a counter and allows users to increment the -count through a Solana program. In this app, we’ll be able to see the current -count, connect our wallet, and increment the count. We’ll be doing this all on +count through a Solana program. In this app, we'll be able to see the current +count, connect our wallet, and increment the count. We'll be doing this all on Devnet and will be compiling only for Android. This program already exists and is already deployed on Devnet. Feel free to @@ -354,7 +354,7 @@ check out the [deployed program's code](https://github.com/Unboxed-Software/anchor-ping-frontend/tree/solution-decrement) if you want more context. -We’ll write this application in vanilla React Native without a starting +We'll write this application in vanilla React Native without a starting template. Solana Mobile provides a [React Native template](https://docs.solanamobile.com/react-native/react-native-scaffold) that shortcuts some of the boilerplate, but there's no better way to learn than @@ -410,16 +410,16 @@ few prerequisite setup items: ![Fake Wallet](/public/assets/courses/unboxed/basic-solana-mobile-fake-wallet.png) - 4. For debugging, you’ll want to use `Logcat`. Now that your fake wallet is + 4. For debugging, you'll want to use `Logcat`. Now that your fake wallet is running on the emulator, go to `View -> Tool Windows -> Logcat`. 
This will - open up a console logging out what’s happening with fake wallet. + open up a console logging out what's happening with fake wallet. 3. (Optional) Install other [Solana wallets](https://solana.com/ecosystem/explore?categories=wallet) on the Google Play store. -Lastly, if you run into Java versioning issues - you’ll want to be on Java -version 11. To check what you’re currently running type `java --version` in your +Lastly, if you run into Java versioning issues - you'll want to be on Java +version 11. To check what you're currently running type `java --version` in your terminal. #### 1. Plan out the App's Structure @@ -439,7 +439,7 @@ files we'll be creating and working with. #### 2. Create the App -Now that we've got some of the basic setup and structure down, let’s scaffold a +Now that we've got some of the basic setup and structure down, let's scaffold a new app with the following command: ```bash @@ -457,12 +457,12 @@ npm run android ``` This should open and run the app in your Android emulator. If you run into -problems, check to make sure you’ve accomplished everything in the +problems, check to make sure you've accomplished everything in the [prerequisites section](#0-prerequisites). #### 3. Install Dependencies -We’ll need to add in our Solana dependencies. +We'll need to add in our Solana dependencies. [The Solana Mobile docs provide a nice list of packages](https://docs.solanamobile.com/react-native/setup) and explanations for why we need them: @@ -484,8 +484,8 @@ In addition to this list, we'll add two more packages: - `assert`: A polyfill that lets Anchor do its thing. - `text-encoding-polyfill`: A polyfill needed to create the `Program` object -If you’re not familiar: polyfills actively replace Node-native libraries to make -them work anywhere Node is not running. We’ll finish our polyfill setup shortly. +If you're not familiar: polyfills actively replace Node-native libraries to make +them work anywhere Node is not running. 
We'll finish our polyfill setup shortly. For now, install dependencies using the following command: ```bash @@ -502,7 +502,7 @@ npm install \ #### 4. Create ConnectionProvider.tsx -Let’s start adding our Solana functionality. Create a new folder called +Let's start adding our Solana functionality. Create a new folder called `components` and within it, a file called `ConnectionProvider.tsx`. This provider will wrap the entire application and make our `Connection` object available throughout. Hopefully, you're noticing a pattern: this is identical to @@ -550,9 +550,9 @@ export const useConnection = (): ConnectionContextState => #### 5. Create AuthProvider.tsx -The next Solana provision we’ll need is the auth provider. This is one of the -main differences between mobile and web development. What we’re implementing -here is roughly equivalent to the `WalletProvider` that we’re used to in web +The next Solana provision we'll need is the auth provider. This is one of the +main differences between mobile and web development. What we're implementing +here is roughly equivalent to the `WalletProvider` that we're used to in web apps. However, since we're using Android and its natively installed wallets, the flow to connect and utilize them is a bit different. Most notably, we need to follow the MWA protocol. @@ -570,14 +570,14 @@ We do this by providing the following in our `AuthProvider`: - `deauthorizeSession(wallet)`: Deauthorizes the `wallet`. - `onChangeAccount`: Acts as a handler when `selectedAccount` is changed. -We’re also going to throw in some utility methods: +We're also going to throw in some utility methods: - `getPublicKeyFromAddress(base64Address)`: Creates a new Public Key object from the Base64 address given from the `wallet` object - `getAuthorizationFromAuthResult`: Handles the authorization result, extracts relevant data from the result, and returns the `Authorization` context object -We’ll expose all of this through a `useAuthorization` hook. 
+We'll expose all of this through a `useAuthorization` hook. Since this provider is the same across virtually all apps, we're going to give you the full implementation that you can copy/paste. We'll dig into the details @@ -877,7 +877,7 @@ export const useProgram = () => useContext(ProgramContext); #### 7. Modify App.tsx -Now that we have all our providers, let’s wrap our app with them. We're going to +Now that we have all our providers, let's wrap our app with them. We're going to re-write the default `App.tsx` with the following changes: - Import our providers and add in our polyfills @@ -923,7 +923,7 @@ export default function App() { #### 8. Create MainScreen.tsx -Now, let’s put everything together to create our UI. Create a new folder called +Now, let's put everything together to create our UI. Create a new folder called `screens` and a new file called `MainScreen.tsx` inside of it. In this file, we are only structuring the screen to display two yet-to-be-created components: `CounterView` and `CounterButton`. @@ -979,7 +979,7 @@ export function MainScreen() { The `CounterView` is the first of our two program-specific files. `CounterView`'s only job is to fetch and listen for updates on our `Counter` -account. Since we’re only listening here, we don’t have to do anything +account. Since we're only listening here, we don't have to do anything MWA-related. It should look identical to a web application. We'll use our `Connection` object to listen for the `programAddress` specified in `ProgramProvider.tsx`. When the account is changed, we update the UI. @@ -1183,7 +1183,7 @@ export function CounterButton() { #### 11. Build and Run -Now it’s time to test that everything works! Build and run with the following +Now it's time to test that everything works! 
Build and run with the following command: ```bash @@ -1205,7 +1205,7 @@ to fix them: wallet installed ( like the fake wallet we installed in Prerequisites ) - You get stuck in a forever loop while calling `increment` → This is likely due to you reaching a Devnet airdrop rate limit. Take out the airdrop section in - `CounterButton` and manually send some Devnet sol to your wallet’s address + `CounterButton` and manually send some Devnet sol to your wallet's address (printed in the console) That's it! You've made your first Solana Mobile dApp. If you get stuck, feel diff --git a/content/courses/mobile/mwa-deep-dive.md b/content/courses/mobile/mwa-deep-dive.md index db3ed3a55..08d232afe 100644 --- a/content/courses/mobile/mwa-deep-dive.md +++ b/content/courses/mobile/mwa-deep-dive.md @@ -255,7 +255,7 @@ transact(async (wallet: Web3MobileWallet) => { Every time you want to call these methods, you will have to call `wallet.authorize()` or `wallet.reauthorize()`. -When invoking `wallet.signAndSendTransactions(...)`, it’s essential to handle +When invoking `wallet.signAndSendTransactions(...)`, it's essential to handle transaction failures gracefully. Transactions can fail due to various reasons such as network issues, signature mismatches, or insufficient funds. Proper error handling ensures a smooth user experience, even when the transaction diff --git a/content/courses/mobile/solana-mobile-dapps-with-expo.md b/content/courses/mobile/solana-mobile-dapps-with-expo.md index bf876a873..16ee816f3 100644 --- a/content/courses/mobile/solana-mobile-dapps-with-expo.md +++ b/content/courses/mobile/solana-mobile-dapps-with-expo.md @@ -304,7 +304,7 @@ eas login #### 2. 
Create the app scaffold -Let’s create our app with the following: +Let's create our app with the following: ```bash npx create-expo-app -t expo-template-blank-typescript solana-expo diff --git a/content/courses/native-onchain-development/cross-program-invocations.md b/content/courses/native-onchain-development/cross-program-invocations.md index a20e7999c..dff3df609 100644 --- a/content/courses/native-onchain-development/cross-program-invocations.md +++ b/content/courses/native-onchain-development/cross-program-invocations.md @@ -223,7 +223,7 @@ if a signature is required on behalf of a PDA. For that, you'll need to use Using `invoke_signed` is a little different just because there is an additional field that requires the seeds used to derive any PDAs that must sign the transaction. You may recall from previous lessons that PDAs do not lie on the -Ed25519 curve and, therefore, do not have a corresponding secret key. You’ve +Ed25519 curve and, therefore, do not have a corresponding secret key. You've been told that programs can provide signatures for their PDAs, but have not learned how that actually happens - until now. Programs provide signatures for their PDAs with the `invoke_signed` function. The first two fields of @@ -259,9 +259,9 @@ signer. #### Security checks There are some common mistakes and things to remember when utilizing CPIs that -are important to your program’s security and robustness. The first thing to +are important to your program's security and robustness. The first thing to remember is that, as we know by now, we have no control over what information is -passed into our programs. For this reason, it’s important to always verify the +passed into our programs. For this reason, it's important to always verify the `program_id`, accounts, and data passed into the CPI. Without these security checks, someone could submit a transaction that invokes an instruction on a completely different program than was expected, which is not ideal. 
@@ -269,7 +269,7 @@ completely different program than was expected, which is not ideal. Fortunately, there are inherent checks on the validity of any PDAs that are marked as signers within the `invoke_signed` function. All other accounts and `instruction_data` should be verified somewhere in your program code before -making the CPI. It's also important to make sure you’re targeting the intended +making the CPI. It's also important to make sure you're targeting the intended instruction on the program you are invoking. The easiest way to do this is to read the source code of the program you will be invoking just as you would if you were constructing an instruction from the client side. @@ -319,11 +319,11 @@ To see this in action, view this CPIs are a very important feature of the Solana ecosystem and they make all programs deployed interoperable with each other. With CPIs there is no need to re-invent the wheel when it comes to development. This creates the opportunity -for building new protocols and applications on top of what’s already been built, -just like building blocks or Lego bricks. It’s important to remember that CPIs +for building new protocols and applications on top of what's already been built, +just like building blocks or Lego bricks. It's important to remember that CPIs are a two-way street and the same is true for any programs that you deploy! If you build something cool and useful, developers have the ability to build on top -of what you’ve done or just plug your protocol into whatever it is that they are +of what you've done or just plug your protocol into whatever it is that they are building. Composability is a big part of what makes crypto so unique and CPIs are what makes this possible on Solana. @@ -345,7 +345,7 @@ gone through prior lessons, the Movie Review program allows users to submit movie reviews and have them stored in PDA accounts. Last lesson, we added the ability to leave comments on other movie reviews using -PDAs. 
In this lesson, we’re going to work on having the program mint tokens to +PDAs. In this lesson, we're going to work on having the program mint tokens to the reviewer or commenter anytime a review or comment is submitted. To implement this, we'll have to invoke the SPL Token Program's `MintTo` @@ -357,7 +357,7 @@ forward with this lab. #### 1. Get starter code and add dependencies To get started, we will be using the final state of the Movie Review program -from the previous PDA lesson. So, if you just completed that lesson then you’re +from the previous PDA lesson. So, if you just completed that lesson then you're all set and ready to go. If you are just jumping in here, no worries, you can [download the starter code here](https://github.com/Unboxed-Software/solana-movie-program/tree/solution-add-comments). We'll be using the `solution-add-comments` branch as our starting point. @@ -386,7 +386,7 @@ to be passed in: - `token_mint` - the mint address of the token - `mint_auth` - address of the authority of the token mint -- `user_ata` - user’s associated token account for this mint (where the tokens +- `user_ata` - user's associated token account for this mint (where the tokens will be minted) - `token_program` - address of the token program @@ -414,7 +414,7 @@ let token_program = next_account_info(account_info_iter)?; There is no additional `instruction_data` required for the new functionality, so no changes need to be made to how data is deserialized. The only additional -information that’s needed is the extra accounts. +information that's needed is the extra accounts. #### 4. Mint tokens to the reviewer in `add_movie_review` @@ -429,7 +429,7 @@ use spl_token::{instruction::initialize_mint, ID as TOKEN_PROGRAM_ID}; ``` Now we can move on to the logic that handles the actual minting of the tokens! 
-We’ll be adding this to the very end of the `add_movie_review` function right +We'll be adding this to the very end of the `add_movie_review` function right before `Ok(())` is returned. Minting tokens requires a signature by the mint authority. Since the program @@ -538,7 +538,7 @@ will mint ten tokens to the reviewer when a review is created. #### 5. Repeat for `add_comment` Our updates to the `add_comment` function will be almost identical to what we -did for the `add_movie_review` function above. The only difference is that we’ll +did for the `add_movie_review` function above. The only difference is that we'll change the amount of tokens minted for a comment from ten to five so that adding reviews are weighted above commenting. First, update the accounts with the same four additional accounts as in the `add_movie_review` function. @@ -799,7 +799,7 @@ pub fn initialize_token_mint(program_id: &Pubkey, accounts: &[AccountInfo]) -> P #### 7. Build and deploy -Now we’re ready to build and deploy our program! You can build the program by +Now we're ready to build and deploy our program! You can build the program by running `cargo build-bpf` and then running the command that is returned, it should look something like `solana program deploy `. diff --git a/content/courses/native-onchain-development/deserialize-custom-data-frontend.md b/content/courses/native-onchain-development/deserialize-custom-data-frontend.md index 69fc63fd4..a9a57d73a 100644 --- a/content/courses/native-onchain-development/deserialize-custom-data-frontend.md +++ b/content/courses/native-onchain-development/deserialize-custom-data-frontend.md @@ -3,7 +3,7 @@ title: Deserialize Program Data objectives: - Explain Program Derived Accounts - Derive PDAs given specific seeds - - Fetch a program’s accounts + - Fetch a program's accounts - Use Borsh to deserialize custom data description: Deserialize instructions in JS/TS clients to send to your native program. 
@@ -23,7 +23,7 @@ description: ## Lesson In the last lesson, we serialized program data that was subsequently stored -onchain by a Solana program. In this lesson, we’ll cover in greater detail how +onchain by a Solana program. In this lesson, we'll cover in greater detail how programs store data on the chain, how to retrieve data, and how to deserialize the data they store. @@ -51,7 +51,7 @@ can be signed for by the program address used to create them. PDAs and the data inside them can be consistently found based on the program address, bump, and seeds. To find a PDA, the program ID and seeds of the -developer’s choice (like a string of text) are passed through the +developer's choice (like a string of text) are passed through the [`findProgramAddress()`](https://solana-labs.github.io/solana-web3.js/classes/PublicKey.html#findProgramAddress) function. @@ -75,10 +75,10 @@ const [pda, bump] = await findProgramAddress( ##### Example: program with user-specific data -In programs that store user-specific data, it’s common to use a user’s public -key as the seed. This separates each user’s data into its own PDA. The -separation makes it possible for the client to locate each user’s data by -finding the address using the program ID and the user’s public key. +In programs that store user-specific data, it's common to use a user's public +key as the seed. This separates each user's data into its own PDA. The +separation makes it possible for the client to locate each user's data by +finding the address using the program ID and the user's public key. ```typescript import { PublicKey } from "@solana/web3.js"; @@ -94,8 +94,8 @@ const [pda, bump] = await PublicKey.findProgramAddressSync( When there are multiple data items per user, a program may use more seeds to create and identify accounts. For example, in a note-taking app there may be one -account per note where each PDA is derived with the user’s public key and the -note’s title. 
+account per note where each PDA is derived with the user's public key and the +note's title. ```typescript const [pda, bump] = await PublicKey.findProgramAddressSync( @@ -138,9 +138,9 @@ fetchProgramAccounts(); ### Deserializing program data The `data` property on an `AccountInfo` object is a buffer. To use it -efficiently, you’ll need to write code that deserializes it into something more +efficiently, you'll need to write code that deserializes it into something more usable. This is similar to the serialization process we covered last lesson. -Just as before, we’ll use [Borsh](https://borsh.io/) and `@coral-xyz/borsh`. If +Just as before, we'll use [Borsh](https://borsh.io/) and `@coral-xyz/borsh`. If you need a refresher on either of these, have a look at the previous lesson. Deserializing requires knowledge of the account layout ahead of time. When @@ -180,22 +180,22 @@ const { playerId, name } = borshAccountSchema.decode(buffer); ## Lab -Let’s practice this together by continuing to work on the Movie Review app from -the last lesson. No worries if you’re just jumping into this lesson - it should +Let's practice this together by continuing to work on the Movie Review app from +the last lesson. No worries if you're just jumping into this lesson - it should be possible to follow either way. As a refresher, this project uses a Solana program deployed on Devnet which lets users review movies. Last lesson, we added functionality to the frontend skeleton letting users submit movie reviews but the list of reviews is still -showing mock data. Let’s fix that by fetching the program’s storage accounts and +showing mock data. Let's fix that by fetching the program's storage accounts and deserializing the data stored there. ![movie review frontend](/public/assets/courses/movie-review-frontend-dapp.png) #### 1. 
Download the starter code -If you didn’t complete the lab from the last lesson or just want to make sure -that you didn’t miss anything, you can download the +If you didn't complete the lab from the last lesson or just want to make sure +that you didn't miss anything, you can download the [starter code](https://github.com/solana-developers/movie-review-frontend/tree/solution-serialize-instruction-data). The project is a fairly simple Next.js application. It includes the @@ -205,7 +205,7 @@ list, a `Form` component for submitting a new review, and a `Movie.ts` file that contains a class definition for a `Movie` object. Note that when you run `npm run dev`, the reviews displayed on the page are -mocks. We’ll be swapping those out for the real deal. +mocks. We'll be swapping those out for the real deal. #### 2. Create the buffer layout @@ -226,7 +226,7 @@ PDA's `data`: 3. `title` as a string representing the title of the reviewed movie. 4. `description` as a string representing the written portion of the review. -Let’s configure a `borsh` layout in the `Movie` class to represent the movie +Let's configure a `borsh` layout in the `Movie` class to represent the movie account data layout. Start by importing `@coral-xyz/borsh`. Next, create a `borshAccountSchema` static property and set it to the appropriate `borsh` struct containing the properties listed above. @@ -255,7 +255,7 @@ structured. #### 3. Create a method to deserialize data -Now that we have the buffer layout set up, let’s create a static method in +Now that we have the buffer layout set up, let's create a static method in `Movie` called `deserialize` that will take an optional `Buffer` and return a `Movie` object or `null`. @@ -296,7 +296,7 @@ export class Movie { ``` The method first checks whether or not the buffer exists and returns `null` if -it doesn’t. Next, it uses the layout we created to decode the buffer, then uses +it doesn't. 
Next, it uses the layout we created to decode the buffer, then uses the data to construct and return an instance of `Movie`. If the decoding fails, the method logs the error and returns `null`. @@ -352,7 +352,7 @@ At this point, you should be able to run the app and see the list of movie reviews retrieved from the program! Depending on how many reviews have been submitted, this may take a long time to -load or may lock up your browser entirely. But don’t worry — next lesson we’ll +load or may lock up your browser entirely. But don't worry — next lesson we'll learn how to page and filter accounts so you can be more surgical with what you load. @@ -363,7 +363,7 @@ before continuing. ## Challenge -Now it’s your turn to build something independently. Last lesson, you worked on +Now it's your turn to build something independently. Last lesson, you worked on the Student Intros app to serialize instruction data and send a new intro to the network. Now, it's time to fetch and deserialize the program's account data. Remember, the Solana program that supports this is at diff --git a/content/courses/native-onchain-development/deserialize-instruction-data.md b/content/courses/native-onchain-development/deserialize-instruction-data.md index 5cf2b27b4..ddd5aa918 100644 --- a/content/courses/native-onchain-development/deserialize-instruction-data.md +++ b/content/courses/native-onchain-development/deserialize-instruction-data.md @@ -66,7 +66,7 @@ let mut mutable_age = 33; mutable_age = 34; ``` -The Rust compiler ensures that immutable variables cannot change, so you don’t +The Rust compiler ensures that immutable variables cannot change, so you don't have to track it yourself. This makes your code easier to reason through and simplifies debugging. @@ -86,7 +86,7 @@ struct User { } ``` -To use a struct after it’s defined, create an instance of the struct by +To use a struct after it's defined, create an instance of the struct by specifying concrete values for each of the fields. 
```rust @@ -364,8 +364,8 @@ There is Rust syntax in this function that we haven't explained yet. The leaves the `Ok` value unchanged. - [`?` operator](https://doc.rust-lang.org/rust-by-example/error/result/enter_question_mark.html): - Unwraps a `Result` or `Option`. If it’s `Ok` or `Some`, it returns the value. - If it’s an `Err` or `None`, it propagates the error up to the calling + Unwraps a `Result` or `Option`. If it's `Ok` or `Some`, it returns the value. + If it's an `Err` or `None`, it propagates the error up to the calling function. ### Program logic @@ -441,7 +441,7 @@ pub enum NoteInstruction { ... } ## Lab -For this lesson’s lab, you'll build the first half of the Movie Review program +For this lesson's lab, you'll build the first half of the Movie Review program from Module 1, focusing on deserializing instruction data. The next lesson will cover the remaining implementation. diff --git a/content/courses/native-onchain-development/paging-ordering-filtering-data-frontend.md b/content/courses/native-onchain-development/paging-ordering-filtering-data-frontend.md index cf84f50d6..2d9a4137d 100644 --- a/content/courses/native-onchain-development/paging-ordering-filtering-data-frontend.md +++ b/content/courses/native-onchain-development/paging-ordering-filtering-data-frontend.md @@ -3,7 +3,7 @@ title: Page, Order, and Filter Program Data objectives: - Page, order, and filter accounts - Prefetch accounts without data - - Determine where in an account’s buffer layout specific data is stored + - Determine where in an account's buffer layout specific data is stored - Prefetch accounts with a subset of data that can be used to order accounts - Fetch only accounts whose data matches specific criteria - Fetch a subset of total accounts using `getMultipleAccounts` @@ -22,8 +22,8 @@ description: "Learn how to efficiently query account data from Solana." 
## Lesson You may have noticed in the last lesson that while we could fetch and display a -list of account data, we didn’t have any granular control over how many accounts -to fetch or their order. In this lesson, we’ll learn about some configuration +list of account data, we didn't have any granular control over how many accounts +to fetch or their order. In this lesson, we'll learn about some configuration options for the `getProgramAccounts` function that will enable things like paging, ordering accounts, and filtering. @@ -47,7 +47,7 @@ only return the subset of the data buffer that you specified. #### Paging Accounts One area where this becomes helpful is with paging. If you want to have a list -that displays all accounts but there are so many accounts that you don’t want to +that displays all accounts but there are so many accounts that you don't want to pull all the data at once, you can fetch all of the accounts but not fetch their data by using a `dataSlice` of `{ offset: 0, length: 0 }`. You can then map the result to a list of account keys whose data you can fetch only when needed. @@ -74,7 +74,7 @@ const deserializedObjects = accountInfos.map(accountInfo => { #### Ordering Accounts The `dataSlice` option is also helpful when you need to order a list of accounts -while paging. You still don’t want to fetch all the data at once, but you do +while paging. You still don't want to fetch all the data at once, but you do need all of the keys and a way to order them upfront. In this case, you need to understand the layout of the account data and configure the data slice to only be the data you need to use for ordering. 
@@ -86,7 +86,7 @@ For example, you might have an account that stores contact information like so: - `firstName` as a string - `secondName` as a string -If you want to order all of the account keys alphabetically based on the user’s +If you want to order all of the account keys alphabetically based on the user's first name, you need to find out the offset where the name starts. The first field, `initialized`, takes the first byte, then `phoneNumber` takes another 8, so the `firstName` field starts at offset `1 + 8 = 9`. However, dynamic data @@ -94,12 +94,12 @@ fields in borsh use the first 4 bytes to record the length of the data, so we can skip an additional 4 bytes, making the offset 13. You then need to determine the length to make the data slice. Since the length -is variable, we can’t know for sure before fetching the data. But you can choose +is variable, we can't know for sure before fetching the data. But you can choose a length that is large enough to cover most cases and short enough to not be too much of a burden to fetch. 15 bytes is plenty for most first names but would result in a small enough download even with a million users. -Once you’ve fetched accounts with the given data slice, you can use the `sort` +Once you've fetched accounts with the given data slice, you can use the `sort` method to sort the array before mapping it to an array of public keys. ```tsx @@ -161,7 +161,7 @@ accounts.sort((a, b) => { const accountKeys = accounts.map(account => account.pubkey); ``` -Note that in the snippet above we don’t compare the data as given. This is +Note that in the snippet above we don't compare the data as given. This is because for dynamically sized types like strings, Borsh places an unsigned, 32-bit (4 byte) integer at the start to indicate the length of the data representing that field. So to compare the first names directly, we need to get @@ -171,7 +171,7 @@ proper length. 
### Use `filters` to only retrieve specific accounts

Limiting the data received per account is great, but what if you only want to
-return accounts that match a specific criteria rather than all of them? That’s
+return accounts that match specific criteria rather than all of them? That's
where the `filters` configuration option comes in. This option is an array that
can have objects matching the following:

@@ -233,19 +233,19 @@ async function fetchMatchingContactAccounts(

Two things to note in the example above:

-1. We’re setting the offset to 13 because we determined previously that the
+1. We're setting the offset to 13 because we determined previously that the
   offset for `firstName` in the data layout is 9 and we want to additionally
   skip the first 4 bytes indicating the length of the string.
-2. We’re using a third-party library
+2. We're using a third-party library
   `bs58`` to perform base-58 encoding on the search term. You can install it
   using `npm install bs58`.

## Lab

-Remember that Movie Review app we worked on in the last two lessons? We’re going
+Remember that Movie Review app we worked on in the last two lessons? We're going
to spice it up a little by paging the review list, ordering the reviews so they
-aren’t so random, and adding some basic search functionality. No worries if
-you’re just jumping into this lesson without having looked at the previous
+aren't so random, and adding some basic search functionality. No worries if
+you're just jumping into this lesson without having looked at the previous
ones - as long as you have the prerequisite knowledge, you should be able to
follow the lab without having worked in this specific project yet.

@@ -253,8 +253,8 @@ follow the lab without having worked in this specific project yet.

#### **1. 
Download the starter code** -If you didn’t complete the lab from the last lesson or just want to make sure -that you didn’t miss anything, you can download the +If you didn't complete the lab from the last lesson or just want to make sure +that you didn't miss anything, you can download the [starter code](https://github.com/solana-developers/movie-review-frontend/tree/solutions-deserialize-account-data). The project is a fairly simple Next.js application. It includes the @@ -265,10 +265,10 @@ contains a class definition for a `Movie` object. #### 2. Add paging to the reviews -First things first, let’s create a space to encapsulate the code for fetching +First things first, let's create a space to encapsulate the code for fetching account data. Create a new file `MovieCoordinator.ts` and declare a -`MovieCoordinator` class. Then let’s move the `MOVIE_REVIEW_PROGRAM_ID` constant -from `MovieList` into this new file since we’ll be moving all references to it +`MovieCoordinator` class. Then let's move the `MOVIE_REVIEW_PROGRAM_ID` constant +from `MovieList` into this new file since we'll be moving all references to it ```tsx const MOVIE_REVIEW_PROGRAM_ID = "CenYq6bDRB7p73EjsPEpiYN7uveyPUTdXkDkgUduboaN"; @@ -281,11 +281,11 @@ note before we dive in: this will be as simple a paging implementation as possible so that we can focus on the complex part of interacting with Solana accounts. You can, and should, do better for a production application. -With that out of the way, let’s create a static property `accounts` of type +With that out of the way, let's create a static property `accounts` of type `web3.PublicKey[]`, a static function `prefetchAccounts(connection: web3.Connection)`, and a static function `fetchPage(connection: web3.Connection, page: number, perPage: number): Promise>`. -You’ll also need to import `@solana/web3.js` and `Movie`. +You'll also need to import `@solana/web3.js` and `Movie`. 
```tsx import { Connection, PublicKey, AccountInfo } from "@solana/web3.js"; @@ -306,7 +306,7 @@ export class MovieCoordinator { } ``` -The key to paging is to prefetch all the accounts without data. Let’s fill in +The key to paging is to prefetch all the accounts without data. Let's fill in the body of `prefetchAccounts` to do this and set the retrieved public keys to the static `accounts` property. @@ -327,8 +327,8 @@ static async prefetchAccounts(connection: Connection) { } ``` -Now, let’s fill in the `fetchPage` method. First, if the accounts haven’t been -prefetched yet, we’ll need to do that. Then, we can get the account public keys +Now, let's fill in the `fetchPage` method. First, if the accounts haven't been +prefetched yet, we'll need to do that. Then, we can get the account public keys that correspond to the requested page and call `connection.getMultipleAccountsInfo`. Finally, we deserialize the account data and return the corresponding `Movie` objects. @@ -433,7 +433,7 @@ At this point, you should be able to run the project and click between pages! #### 3. Order reviews alphabetically by title -If you look at the reviews, you might notice they aren’t in any specific order. +If you look at the reviews, you might notice they aren't in any specific order. We can fix this by adding back just enough data into our data slice to help us do some sorting. The various properties in the movie review data buffer are laid out as follows @@ -445,17 +445,17 @@ out as follows Based on this, the offset we need to provide to the data slice to access `title` is 2. The length, however, is indeterminate, so we can just provide what seems -to be a reasonable length. I’ll stick with 18 as that will cover the length of +to be a reasonable length. I'll stick with 18 as that will cover the length of most titles without fetching too much data every time. 
-Once we’ve modified the data slice in `getProgramAccounts`, we then need to +Once we've modified the data slice in `getProgramAccounts`, we then need to actually sort the returned array. To do this, we need to compare the part of the data buffer that actually corresponds to `title`. The first 4 bytes of a dynamic field in Borsh are used to store the length of the field in bytes. So in any given buffer `data` that is sliced the way we discussed above, the string portion is `data.slice(4, 4 + data[0])`. -Now that we’ve thought through this, let’s modify the implementation of +Now that we've thought through this, let's modify the implementation of `prefetchAccounts` in `MovieCoordinator`: ```tsx @@ -517,15 +517,15 @@ reviews ordered alphabetically. #### 4. Add search -The last thing we’ll do to improve this app is to add some basic search -capability. Let’s add a `search` parameter to `prefetchAccounts` and reconfigure +The last thing we'll do to improve this app is to add some basic search +capability. Let's add a `search` parameter to `prefetchAccounts` and reconfigure the body of the function to use it. We can use the `filters` property of the `config` parameter of `getProgramAccounts` to filter accounts by specific data. The offset to the `title` fields is 2, but the first 4 bytes are the length of the title so the actual offset to the string itself is 6. Remember that the bytes need to be base -58 encoded, so let’s install and import `bs58`. +58 encoded, so let's install and import `bs58`. ```tsx import bs58 from 'bs58' @@ -580,7 +580,7 @@ static async prefetchAccounts(connection: Connection, search: string) { ``` Now, add a `search` parameter to `fetchPage` and update its call to -`prefetchAccounts` to pass it along. We’ll also need to add a `reload` boolean +`prefetchAccounts` to pass it along. We'll also need to add a `reload` boolean parameter to `fetchPage` so that we can force a refresh of the account prefetching every time the search value changes. 
@@ -625,7 +625,7 @@ static async fetchPage( } ``` -With that in place, let’s update the code in `MovieList` to call this properly. +With that in place, let's update the code in `MovieList` to call this properly. First, add `const [search, setSearch] = useState('')` near the other `useState` calls. Then update the call to `MovieCoordinator.fetchPage` in the `useEffect` @@ -673,7 +673,7 @@ return ( ); ``` -And that’s it! The app now has ordered reviews, paging, and search. +And that's it! The app now has ordered reviews, paging, and search. That was a lot to digest, but you made it through. If you need to spend some more time with the concepts, feel free to reread the sections that were most @@ -682,7 +682,7 @@ challenging for you and/or have a look at the ## Challenge -Now it’s your turn to try and do this on your own. Using the Student Intros app +Now it's your turn to try and do this on your own. Using the Student Intros app from last lesson, add paging, ordering alphabetically by name, and searching by name. @@ -691,9 +691,9 @@ name. 1. You can build this from scratch or you can download the [starter code](https://github.com/solana-developers/solana-student-intro-frontend/tree/solution-deserialize-account-data) 2. Add paging to the project by prefetching accounts without data, then only - fetching the account data for each account when it’s needed. + fetching the account data for each account when it's needed. 3. Order the accounts displayed in the app alphabetically by name. -4. Add the ability to search through introductions by a student’s name. +4. Add the ability to search through introductions by a student's name. This is challenging. If you get stuck, feel free to reference the [solution code](https://github.com/solana-developers/solana-student-intro-frontend/tree/solution-paging-account-data). 
diff --git a/content/courses/native-onchain-development/program-security.md b/content/courses/native-onchain-development/program-security.md index bdf694fcd..1bbba4de8 100644 --- a/content/courses/native-onchain-development/program-security.md +++ b/content/courses/native-onchain-development/program-security.md @@ -39,7 +39,7 @@ trying to exploit your program, anticipating failure points is essential to secure program development. Remember, **you have no control over the transactions that will be sent to your -program once it’s deployed**. You can only control how your program handles +program once it's deployed**. You can only control how your program handles them. While this lesson is far from a comprehensive overview of program security, we'll cover some of the basic pitfalls to look out for. @@ -281,8 +281,8 @@ To avoid integer overflow and underflow, either: ## Lab -Let’s practice together with the Movie Review program we've worked on in -previous lessons. No worries if you’re just jumping into this lesson without +Let's practice together with the Movie Review program we've worked on in +previous lessons. No worries if you're just jumping into this lesson without having done the previous lesson - it should be possible to follow along either way. @@ -325,7 +325,7 @@ Since we'll be allowing updates to movie reviews, we also changed `account_len` in the `add_movie_review` function (now in `processor.rs`). Instead of calculating the size of the review and setting the account length to only as large as it needs to be, we're simply going to allocate 1000 bytes to each -review account. This way, we don’t have to worry about reallocating size or +review account. This way, we don't have to worry about reallocating size or re-calculating rent when a user updates their movie review. 
We went from this:

@@ -343,7 +343,7 @@ let account_len: usize = 1000;

The
[realloc](https://docs.rs/solana-sdk/latest/solana_sdk/account_info/struct.AccountInfo.html#method.realloc)
method was just recently enabled by Solana Labs which allows you to dynamically
change the size of your accounts. We will not be using this method for this lab, but
-it’s something to be aware of.
+it's something to be aware of.

Finally, we've also implemented some additional functionality for our
`MovieAccountState` struct in `state.rs` using the `impl` keyword.

@@ -418,7 +418,7 @@ Note that in addition to adding the error cases, we also added the
implementation that lets us convert our error into a `ProgramError` type as
needed.

-Before moving on, let’s bring `ReviewError` into scope in the `processor.rs`. We
+Before moving on, let's bring `ReviewError` into scope in the `processor.rs`. We
will be using these errors shortly when we add our security checks.

```rust
@@ -455,8 +455,8 @@ if !initializer.is_signer {

Next, let's make sure the `pda_account` passed in by the user is the `pda` we
expect. Recall we derived the `pda` for a movie review using the `initializer`
-and `title` as seeds. Within our instruction we’ll derive the `pda` again and
-then check if it matches the `pda_account`. If the addresses do not match, we’ll
+and `title` as seeds. Within our instruction we'll derive the `pda` again and
+then check if it matches the `pda_account`. If the addresses do not match, we'll
return our custom `InvalidPDA` error.

```rust
@@ -474,7 +474,7 @@ if pda != *pda_account.key {

Now let's perform some data validation.

We'll start by making sure `rating` falls within the 1 to 5 scale. If the rating
-provided by the user outside of this range, we’ll return our custom
+provided by the user is outside of this range, we'll return our custom
`InvalidRating` error. 
```rust @@ -484,8 +484,8 @@ if rating > 5 || rating < 1 { } ``` -Next, let’s check that the content of the review does not exceed the 1000 bytes -we’ve allocated for the account. If the size exceeds 1000 bytes, we’ll return +Next, let's check that the content of the review does not exceed the 1000 bytes +we've allocated for the account. If the size exceeds 1000 bytes, we'll return our custom `InvalidDataLength` error. ```rust @@ -597,7 +597,7 @@ pub fn add_movie_review( Now that `add_movie_review` is more secure, let's turn our attention to supporting the ability to update a movie review. -Let’s begin by updating `instruction.rs`. We’ll start by adding an +Let's begin by updating `instruction.rs`. We'll start by adding an `UpdateMovieReview` variant to `MovieInstruction` that includes embedded data for the new title, rating, and description. @@ -730,11 +730,11 @@ if pda_account.owner != program_id { #### Signer Check -Next, let’s perform a signer check to verify that the `initializer` of the +Next, let's perform a signer check to verify that the `initializer` of the update instruction has also signed the transaction. Since we are updating the data for a movie review, we want to ensure that the original `initializer` of the review has approved the changes by signing the transaction. If the -`initializer` did not sign the transaction, we’ll return an error. +`initializer` did not sign the transaction, we'll return an error. ```rust if !initializer.is_signer { @@ -745,9 +745,9 @@ if !initializer.is_signer { #### Account Validation -Next, let’s check that the `pda_account` passed in by the user is the PDA we +Next, let's check that the `pda_account` passed in by the user is the PDA we expect by deriving the PDA using `initializer` and `title` as seeds. If the -addresses do not match, we’ll return our custom `InvalidPDA` error. We'll +addresses do not match, we'll return our custom `InvalidPDA` error. 
We'll
implement this the same way we did in the `add_movie_review` function.

```rust
@@ -787,7 +787,7 @@ if !account_data.is_initialized() {

Next, we need to validate the `rating`, `title`, and `description` data just
like in the `add_movie_review` function. We want to limit the `rating` to a
scale of 1 to 5 and limit the overall size of the review to be fewer than 1000
-bytes. If the rating provided by the user outside of this range, then we’ll
+bytes. If the rating provided by the user is outside of this range, then we'll
return our custom `InvalidRating` error. If the review is too long, then we'll
return our custom `InvalidDataLength` error.

@@ -912,7 +912,7 @@ continuing.

## Challenge

-Now it’s your turn to build something independently by building on top of the
+Now it's your turn to build something independently by building on top of the
Student Intro program that you've used in previous lessons. If you haven't been
following along or haven't saved your code from before, feel free to use
[this starter code](https://beta.solpg.io/62b11ce4f6273245aca4f5b2).
diff --git a/content/courses/native-onchain-development/program-state-management.md b/content/courses/native-onchain-development/program-state-management.md
index 1f858b9f6..f8dd25ff8 100644
--- a/content/courses/native-onchain-development/program-state-management.md
+++ b/content/courses/native-onchain-development/program-state-management.md
@@ -554,7 +554,7 @@ program.
   `movieProgramId` in the `index.ts` component with the public key of the
   program you've deployed.
 - If you use the frontend, simply replace the `MOVIE_REVIEW_PROGRAM_ID` in the
-  `review-form.tsx` components with the address of the program you’ve deployed.
+  `review-form.tsx` components with the address of the program you've deployed.
Then run the frontend, submit a view, and refresh the browser to see the review. 
If you need more time with this project to feel comfortable with these concepts, diff --git a/content/courses/native-onchain-development/serialize-instruction-data-frontend.md b/content/courses/native-onchain-development/serialize-instruction-data-frontend.md index 798a3ac0d..410b2abf2 100644 --- a/content/courses/native-onchain-development/serialize-instruction-data-frontend.md +++ b/content/courses/native-onchain-development/serialize-instruction-data-frontend.md @@ -27,7 +27,7 @@ description: How to deserialize data fetched from Solana accounts. buffer. To facilitate this process of serialization, we will be using [Borsh](https://borsh.io/). - Transactions can fail to be processed by the blockchain for any number of - reasons, we’ll discuss some of the most common ones here. + reasons, we'll discuss some of the most common ones here. ## Lesson @@ -86,14 +86,14 @@ if every instruction succeeds then the transaction as a whole will be successful, but if a single instruction fails then the entire transaction will fail immediately with no side-effects. -The account array is not just an array of the accounts’ public keys. Each object -in the array includes the account’s public key, whether or not it is a signer on +The account array is not just an array of the accounts' public keys. Each object +in the array includes the account's public key, whether or not it is a signer on the transaction, and whether or not it is writable. Including whether or not an account is writable during the execution of an instruction allows the runtime to facilitate parallel processing of smart contracts. Because you must define which accounts are read-only and which you will write to, the runtime can determine which transactions are non-overlapping or read-only and allow them to execute -concurrently. To learn more about Solana’s runtime, check out this +concurrently. 
To learn more about Solana's runtime, check out this [blog post on Sealevel](https://solana.com/news/sealevel---parallel-processing-thousands-of-smart-contracts). #### Instruction Data @@ -104,17 +104,17 @@ an HTTP request lets you build dynamic and flexible REST APIs. Just as the structure of the body of an HTTP request is dependent on the endpoint you intend to call, the structure of the byte buffer used as -instruction data is entirely dependent on the recipient program. If you’re -building a full-stack dApp on your own, then you’ll need to copy the same +instruction data is entirely dependent on the recipient program. If you're +building a full-stack dApp on your own, then you'll need to copy the same structure that you used when building the program over to the client-side code. -If you’re working with another developer who is handling the program +If you're working with another developer who is handling the program development, you can coordinate to ensure matching buffer layouts. -Let’s think about a concrete example. Imagine working on a Web3 game and being +Let's think about a concrete example. Imagine working on a Web3 game and being responsible for writing client-side code that interacts with a player inventory program. The program was designed to allow the client to: -- Add inventory based on a player’s game-play results +- Add inventory based on a player's game-play results - Transfer inventory from one player to another - Equip a player with selected inventory items @@ -125,11 +125,11 @@ Each program, however, only has one entry point. You would instruct the program on which of these functions to run through the instruction data. You would also include in the instruction data any information the function -needs to execute properly, e.g. an inventory item’s ID, a player to transfer +needs to execute properly, e.g. an inventory item's ID, a player to transfer inventory to, etc. 
Exactly _how_ this data would be structured would depend on how the program was -written, but it’s common to have the first field in instruction data be a number +written, but it's common to have the first field in instruction data be a number that the program can map to a function, after which additional fields act as function arguments. @@ -145,10 +145,10 @@ in Solana is [Borsh](https://borsh.io). Per the website: Borsh maintains a [JS library](https://github.com/near/borsh-js) that handles serializing common types into a buffer. There are also other packages built on -top of Borsh that try to make this process even easier. We’ll be using the +top of Borsh that try to make this process even easier. We'll be using the `@coral-xyz/borsh` library which can be installed using `npm`. -Building off of the previous game inventory example, let’s look at a +Building off of the previous game inventory example, let's look at a hypothetical scenario where we are instructing the program to equip a player with a given item. Assume the program is designed to accept a buffer that represents a struct with the following properties: @@ -176,9 +176,9 @@ const equipPlayerSchema = borsh.struct([ You can then encode data using this schema with the `encode` method. This method accepts as arguments an object representing the data to be serialized and a -buffer. In the below example, we allocate a new buffer that’s much larger than +buffer. In the below example, we allocate a new buffer that's much larger than needed, then encode the data into that buffer and slice the original buffer down -into a new buffer that’s only as large as needed. +into a new buffer that's only as large as needed. ```typescript import * as borsh from "@coral-xyz/borsh"; @@ -198,13 +198,13 @@ equipPlayerSchema.encode( const instructionBuffer = buffer.subarray(0, equipPlayerSchema.getSpan(buffer)); ``` -Once a buffer is properly created and the data serialized, all that’s left is -building the transaction. 
This is similar to what you’ve done in previous +Once a buffer is properly created and the data serialized, all that's left is +building the transaction. This is similar to what you've done in previous lessons. The example below assumes that: - `player`, `playerInfoAccount`, and `PROGRAM_ID` are already defined somewhere outside the code snippet -- `player` is a user’s public key +- `player` is a user's public key - `playerInfoAccount` is the public key of the account where inventory changes will be written - `SystemProgram` will be used in the process of executing the instruction. @@ -277,8 +277,8 @@ try { ## Lab -Let’s practice this together by building a Movie Review app that lets users -submit a movie review and have it stored on Solana’s network. We’ll build this +Let's practice this together by building a Movie Review app that lets users +submit a movie review and have it stored on Solana's network. We'll build this app a little bit at a time over the next few lessons, adding new functionality each lesson. @@ -288,7 +288,7 @@ Here's a quick diagram of the program we'll build: ![Solana stores data items in PDAs, which can be found using their seeds](/public/assets/courses/unboxed/movie-review-program.svg) -The public key of the Solana program we’ll use for this application is +The public key of the Solana program we'll use for this application is `CenYq6bDRB7p73EjsPEpiYN7uveyPUTdXkDkgUduboaN`. #### 1. Download the starter code @@ -303,8 +303,8 @@ list, a `Form` component for submitting a new review, and a `Movie.ts` file that contains a class definition for a `Movie` object. Note that for now, the movies displayed on the page when you run `npm run dev` -are mocks. In this lesson, we’ll focus on adding a new review but we won’t be -able to see that review displayed. Next lesson, we’ll focus on deserializing +are mocks. In this lesson, we'll focus on adding a new review but we won't be +able to see that review displayed. 
Next lesson, we'll focus on deserializing custom data from onchain accounts. #### 2. Create the buffer layout @@ -322,7 +322,7 @@ data to contain: 4. `description` as a string representing the written portion of the review you are leaving for the movie. -Let’s configure a `borsh` layout in the `Movie` class. Start by importing +Let's configure a `borsh` layout in the `Movie` class. Start by importing `@coral-xyz/borsh`. Next, create a `borshInstructionSchema` property and set it to the appropriate `borsh` struct containing the properties listed above. @@ -350,8 +350,8 @@ how the program is structured, the transaction will fail. #### 3. Create a method to serialize data -Now that we have the buffer layout set up, let’s create a method in `Movie` -called `serialize()` that will return a `Buffer` with a `Movie` object’s +Now that we have the buffer layout set up, let's create a method in `Movie` +called `serialize()` that will return a `Buffer` with a `Movie` object's properties encoded into the appropriate layout. Instead of allocating a fixed buffer size, we'll calculate the size dynamically @@ -437,7 +437,7 @@ send the transaction when a user submits the form. Open `Form.tsx` and locate the `handleTransactionSubmit` function. This gets called by `handleSubmit` each time a user submits the Movie Review form. -Inside this function, we’ll be creating and sending the transaction that +Inside this function, we'll be creating and sending the transaction that contains the data submitted through the form. Start by importing `@solana/web3.js` and importing `useConnection` and @@ -495,7 +495,7 @@ export const Form: FC = () => { } ``` -Before we implement `handleTransactionSubmit`, let’s talk about what needs to be +Before we implement `handleTransactionSubmit`, let's talk about what needs to be done. We need to: 1. Check that `publicKey` exists to ensure that the user has connected their @@ -506,12 +506,12 @@ done. We need to: 4. 
Get all of the accounts that the transaction will read or write. 5. Create a new `Instruction` object that includes all of these accounts in the `keys` argument, includes the buffer in the `data` argument, and includes the - program’s public key in the `programId` argument. + program's public key in the `programId` argument. 6. Add the instruction from the last step to the transaction. 7. Call `sendTransaction`, passing in the assembled transaction. -That’s quite a lot to process! But don’t worry, it gets easier the more you do -it. Let’s start with the first 3 steps from above: +That's quite a lot to process! But don't worry, it gets easier the more you do +it. Let's start with the first 3 steps from above: ```typescript const handleTransactionSubmit = async (movie: Movie) => { @@ -527,8 +527,8 @@ const handleTransactionSubmit = async (movie: Movie) => { The next step is to get all of the accounts that the transaction will read or write. In past lessons, the account where data will be stored has been given to -you. This time, the account’s address is more dynamic, so it needs to be -computed. We’ll cover this in-depth in the next lesson, but for now, you can use +you. This time, the account's address is more dynamic, so it needs to be +computed. We'll cover this in-depth in the next lesson, but for now, you can use the following, where `pda` is the address to the account where data will be stored: @@ -598,9 +598,9 @@ const handleTransactionSubmit = async (movie: Movie) => { }; ``` -And that’s it! You should now be able to use the form on the site to submit a -movie review. While you won’t see the UI update to reflect the new review, you -can look at the transaction’s program logs on Solana Explorer to see that it was +And that's it! You should now be able to use the form on the site to submit a +movie review. 
While you won't see the UI update to reflect the new review, you +can look at the transaction's program logs on Solana Explorer to see that it was successful. If you need a bit more time with this project to feel comfortable, have a look @@ -609,7 +609,7 @@ at the complete ## Challenge -Now it’s your turn to build something independently. Create an application that +Now it's your turn to build something independently. Create an application that lets students of this course introduce themselves! The Solana program that supports this is at `HdE95RSVsdb315jfJtaykXhXY478h53X6okDupVfY9yf`. diff --git a/content/courses/onchain-development/anchor-cpi.md b/content/courses/onchain-development/anchor-cpi.md index 7202f7c16..5c96a7456 100644 --- a/content/courses/onchain-development/anchor-cpi.md +++ b/content/courses/onchain-development/anchor-cpi.md @@ -330,10 +330,10 @@ pub enum MyError { ## Lab -Let’s practice the concepts we’ve gone over in this lesson by building on top of +Let's practice the concepts we've gone over in this lesson by building on top of the Movie Review program from previous lessons. -In this lab we’ll update the program to mint tokens to users when they submit a +In this lab we'll update the program to mint tokens to users when they submit a new movie review. @@ -342,7 +342,7 @@ new movie review. To get started, we will be using the final state of the Anchor Movie Review program from the previous lesson. So, if you just completed that lesson then -you’re all set and ready to go. If you are just jumping in here, no worries, you +you're all set and ready to go. If you are just jumping in here, no worries, you can [download the starter code](https://github.com/Unboxed-Software/anchor-movie-review-program/tree/solution-pdas). We'll be using the `solution-pdas` branch as our starting point. 
@@ -410,7 +410,7 @@ pub fn initialize_token_mint(_ctx: Context) -> Result<()> { ### Anchor Error -Next, let’s create an Anchor Error that we’ll use to validate the following: +Next, let's create an Anchor Error that we'll use to validate the following: - The `rating` passed to either the `add_movie_review` or `update_movie_review` instruction. @@ -432,7 +432,7 @@ enum MovieReviewError { ### Update add_movie_review instruction -Now that we've done some setup, let’s update the `add_movie_review` instruction +Now that we've done some setup, let's update the `add_movie_review` instruction and `AddMovieReview` context type to mint tokens to the reviewer. Next, update the `AddMovieReview` context type to add the following accounts: @@ -485,7 +485,7 @@ been initialized, it will be initialized as an associated token account for the specified mint and authority. Also, the payer for the costs related with the account initialization will be set under the constraint `payer`. -Next, let’s update the `add_movie_review` instruction to do the following: +Next, let's update the `add_movie_review` instruction to do the following: - Check that `rating` is valid. If it is not a valid rating, return the `InvalidRating` error. @@ -493,7 +493,7 @@ Next, let’s update the `add_movie_review` instruction to do the following: `TitleTooLong` error. - Check that `description` length is valid. If it is not a valid length, return the `DescriptionTooLong` error. -- Make a CPI to the token program’s `mint_to` instruction using the mint +- Make a CPI to the token program's `mint_to` instruction using the mint authority PDA as a signer. Note that we'll mint 10 tokens to the user but need to adjust for the mint decimals by making it `10*10^6`. @@ -608,7 +608,7 @@ pub fn update_movie_review( ### Test -Those are all of the changes we need to make to the program! Now, let’s update +Those are all of the changes we need to make to the program! Now, let's update our tests. 
Start by making sure your imports and `describe` function look like this: diff --git a/content/courses/onchain-development/anchor-pdas.md b/content/courses/onchain-development/anchor-pdas.md index 9f7d80ecb..b2a55c9e3 100644 --- a/content/courses/onchain-development/anchor-pdas.md +++ b/content/courses/onchain-development/anchor-pdas.md @@ -235,7 +235,7 @@ To use `init_if_needed`, you must first enable the feature in `Cargo.toml`. anchor-lang = { version = "0.30.1", features = ["init-if-needed"] } ``` -Once you’ve enabled the feature, you can include the constraint in the +Once you've enabled the feature, you can include the constraint in the `#[account(…)]` attribute macro. The example below demonstrates using the `init_if_needed` constraint to initialize a new associated token account if one does not already exist. @@ -348,7 +348,7 @@ The `close` constraint provides a simple and secure way to close an existing account. The `close` constraint marks the account as closed at the end of the -instruction’s execution by setting its discriminator to a _special value_ called +instruction's execution by setting its discriminator to a _special value_ called `CLOSED_ACCOUNT_DISCRIMINATOR` and sends its lamports to a specified account. This _special value_ prevents the account from being reopened because any attempt to reinitialize the account will fail the discriminator check. @@ -372,7 +372,7 @@ pub struct Close<'info> { ## Lab -Let’s practice the concepts we’ve gone over in this lesson by creating a Movie +Let's practice the concepts we've gone over in this lesson by creating a Movie Review program using the Anchor framework. This program will allow users to: @@ -385,7 +385,7 @@ This program will allow users to: ### Create a new Anchor project -To begin, let’s create a new project using `anchor init`. +To begin, let's create a new project using `anchor init`. 
```bash anchor init anchor-movie-review-program @@ -428,14 +428,14 @@ pub mod anchor_movie_review_program { ### MovieAccountState -First, let’s use the `#[account]` attribute macro to define the +First, let's use the `#[account]` attribute macro to define the `MovieAccountState` that will represent the data structure of the movie review accounts. As a reminder, the `#[account]` attribute macro implements various traits that help with serialization and deserialization of the account, set the discriminator for the account, and set the owner of a new account as the program ID defined in the `declare_id!` macro. -Within each movie review account, we’ll store the: +Within each movie review account, we'll store the: - `reviewer` - user creating the review - `rating` - rating for the movie @@ -504,8 +504,8 @@ more detail in the next chapter. ### Add Movie Review -Next, let’s implement the `add_movie_review` instruction. The `add_movie_review` -instruction will require a `Context` of type `AddMovieReview` that we’ll +Next, let's implement the `add_movie_review` instruction. The `add_movie_review` +instruction will require a `Context` of type `AddMovieReview` that we'll implement shortly. The instruction will require three additional arguments as instruction data @@ -515,8 +515,8 @@ provided by a reviewer: - `description` - details of the review as a `String` - `rating` - rating for the movie as a `u8` -Within the instruction logic, we’ll populate the data of the new `movie_review` -account with the instruction data. We’ll also set the `reviewer` field as the +Within the instruction logic, we'll populate the data of the new `movie_review` +account with the instruction data. We'll also set the `reviewer` field as the `initializer` account from the instruction context. 
We will also perform some checks, using the `require!` macro, to make sure that: @@ -568,7 +568,7 @@ pub mod anchor_movie_review_program{ } ``` -Next, let’s create the `AddMovieReview` struct that we used as the generic in +Next, let's create the `AddMovieReview` struct that we used as the generic in the instruction's context. This struct will list the accounts the `add_movie_review` instruction requires. @@ -610,7 +610,7 @@ pub struct AddMovieReview<'info> { ### Update Movie Review -Next, let’s implement the `update_movie_review` instruction with a context whose +Next, let's implement the `update_movie_review` instruction with a context whose generic type is `UpdateMovieReview`. Just as before, the instruction will require three additional arguments as @@ -620,7 +620,7 @@ instruction data provided by a reviewer: - `description` - details of the review - `rating` - rating for the movie -Within the instruction logic we’ll update the `rating` and `description` stored +Within the instruction logic we'll update the `rating` and `description` stored on the `movie_review` account. While the `title` doesn't get used in the instruction function itself, we'll @@ -654,7 +654,7 @@ pub mod anchor_movie_review_program { } ``` -Next, let’s create the `UpdateMovieReview` struct to define the accounts that +Next, let's create the `UpdateMovieReview` struct to define the accounts that the `update_movie_review` instruction needs. Since the `movie_review` account will have already been initialized by this @@ -698,7 +698,7 @@ expanding the space allocated to the account. ### Delete Movie Review -Lastly, let’s implement the `delete_movie_review` instruction to close an +Lastly, let's implement the `delete_movie_review` instruction to close an existing `movie_review` account. We'll use a context whose generic type is `DeleteMovieReview` and won't include @@ -722,7 +722,7 @@ pub mod anchor_movie_review_program { } ``` -Next, let’s implement the `DeleteMovieReview` struct. 
+Next, let's implement the `DeleteMovieReview` struct. ```rust #[derive(Accounts)] @@ -869,7 +869,7 @@ continuing. ## Challenge -Now it’s your turn to build something independently. Equipped with the concepts +Now it's your turn to build something independently. Equipped with the concepts introduced in this lesson, try to recreate the Student Intro program that we've used before using the Anchor framework. diff --git a/content/courses/onchain-development/intro-to-anchor-frontend.md b/content/courses/onchain-development/intro-to-anchor-frontend.md index 78443f151..7d2a419bb 100644 --- a/content/courses/onchain-development/intro-to-anchor-frontend.md +++ b/content/courses/onchain-development/intro-to-anchor-frontend.md @@ -16,7 +16,7 @@ description: - An **IDL** is a file representing the structure of a Solana program. Programs written and built using Anchor automatically generate a corresponding IDL. IDL stands for Interface Description Language. -- `@coral-xyz/anchor` is a Typescript client that includes everything you’ll +- `@coral-xyz/anchor` is a Typescript client that includes everything you'll need to interact with Anchor programs - An **Anchor `Provider`** object combines a `connection` to a cluster and a specified `wallet` to enable transaction signing @@ -288,7 +288,7 @@ The `Provider` object combines two things: - `Wallet` - a specified address used to pay for and sign transactions The `Provider` is then able to send transactions to the Solana blockchain on -behalf of a `Wallet` by including the wallet’s signature to outgoing +behalf of a `Wallet` by including the wallet's signature to outgoing transactions. When using a frontend with a Solana wallet provider, all outgoing transactions must still be approved by the user via their wallet browser extension. 
@@ -361,7 +361,7 @@ The `AnchorProvider` constructor takes three parameters: - `opts` - optional parameter that specifies the confirmation options, using a default setting if one is not provided -Once you’ve created the `Provider` object, you then set it as the default +Once you've created the `Provider` object, you then set it as the default provider using `setProvider`. ```typescript @@ -559,7 +559,7 @@ const accounts = await program.account.counter.fetchMultiple([ ## Lab -Let’s practice this together by building a frontend for the Counter program from +Let's practice this together by building a frontend for the Counter program from last lesson. As a reminder, the Counter program has two instructions: - `initialize` - initializes a new `Counter` account and sets the `count` to `0` @@ -576,14 +576,14 @@ This project is a simple Next.js application, created using `npx create-next-dapp` The `idl.json` file for the Counter program, and the `Initialize` and -`Increment` components we’ll be building throughout this lab. +`Increment` components we'll be building throughout this lab. #### 2. `Initialize` -To begin, let’s complete the setup to create the `useCounterProgram` hook in +To begin, let's complete the setup to create the `useCounterProgram` hook in `components/counter/counter-data-access.tsx` component. -Remember, we’ll need an instance of `Program` to use the Anchor `MethodsBuilder` +Remember, we'll need an instance of `Program` to use the Anchor `MethodsBuilder` to invoke the instructions on our program. `create-solana-dapp` already creates a `getCounterProgram` for us, which will return us the `Program` instance. @@ -607,7 +607,7 @@ const program = getCounterProgram(provider); Now that we've the program instance, we can actually invoke the program's `initialize` instruction. We'll do this using `useMutation`. 
-Remember, We’ll need to generate a new `Keypair` for the new `Counter` account
+Remember, we'll need to generate a new `Keypair` for the new `Counter` account
 since we are initializing an account for the first time.
 
 ```typescript
@@ -658,7 +658,7 @@ created. This method internally calls, `getProgramAccounts`.
 
 #### 4. `Increment`
 
-Next, let’s move on the the `useCounterProgramAccount` hook. As we have earlier
+Next, let's move on to the `useCounterProgramAccount` hook. As we have earlier
 already created `program` and `accounts` function in previous hook, we'll call
 the hooks to access them and not redefine them.
 
@@ -673,7 +673,7 @@ export function useCounterProgramAccount({ account }: { account: PublicKey }) {
 
 ```
 
-Next, let’s use the Anchor `MethodsBuilder` to build a new instruction to invoke
+Next, let's use the Anchor `MethodsBuilder` to build a new instruction to invoke
 the `increment` instruction. Again, Anchor can infer the `user` account from the
 wallet so we only need to include the `counter` account.
 
@@ -732,11 +732,11 @@ continuing.
 
 ## Challenge
 
-Now it’s your turn to build something independently. Building on top of what
-we’ve done in the lab, try to create a new component in the frontend that
+Now it's your turn to build something independently. Building on top of what
+we've done in the lab, try to create a new component in the frontend that
 implements a button to decrements the counter.
 
-Before building the component in the frontend, you’ll first need to:
+Before building the component in the frontend, you'll first need to:
 
 1. Build and deploy a new program that implements a `decrement` instruction
2. 
Update the IDL file in the frontend with the one from your new program diff --git a/content/courses/onchain-development/intro-to-anchor.md b/content/courses/onchain-development/intro-to-anchor.md index 2d3305a34..efda0cd27 100644 --- a/content/courses/onchain-development/intro-to-anchor.md +++ b/content/courses/onchain-development/intro-to-anchor.md @@ -42,9 +42,9 @@ Anchor uses macros and traits to generate boilerplate Rust code for you. These provide a clear structure to your program so you can more easily reason about your code. The main high-level macros and attributes are: -- `declare_id` - a macro for declaring the program’s onchain address +- `declare_id` - a macro for declaring the program's onchain address - `#[program]` - an attribute macro used to denote the module containing the - program’s instruction logic + program's instruction logic - `Accounts` - a trait applied to structs representing the list of accounts required for an instruction - `#[account]` - an attribute macro used to define custom account types for the @@ -183,7 +183,7 @@ You may have noticed in the previous example that one of the accounts in was of type `Program`. Anchor provides a number of account types that can be used to represent -accounts. Each type implements different account validation. We’ll go over a few +accounts. Each type implements different account validation. We'll go over a few of the common types you may encounter, but be sure to look through the [full list of account types](https://docs.rs/anchor-lang/latest/anchor_lang/accounts/index.html). @@ -329,8 +329,8 @@ from the first 8 bytes of the SHA256 hash of the account type's name. The first 8 bytes are reserved for the account discriminator when implementing account serialization traits (which is almost always in an Anchor program). -As a result, any calls to `AccountDeserialize`’s `try_deserialize` will check -this discriminator. 
If it doesn’t match, an invalid account was given, and the
+As a result, any calls to `AccountDeserialize`'s `try_deserialize` will check
+this discriminator. If it doesn't match, an invalid account was given, and the
 account deserialization will exit with an error.
 
 The `#[account]` attribute also implements the `Owner` trait for a struct using
@@ -500,7 +500,7 @@ pub struct Counter {
 
 #### 3. Implement `Context` type `Initialize`
 
-Next, using the `#[derive(Accounts)]` macro, let’s implement the `Initialize`
+Next, using the `#[derive(Accounts)]` macro, let's implement the `Initialize`
 type that lists and validates the accounts used by the `initialize` instruction.
 It'll need the following accounts:
@@ -522,10 +522,10 @@ pub struct Initialize<'info> {
 
 #### 4. Add the `initialize` instruction
 
-Now that we have our `Counter` account and `Initialize` type , let’s implement
+Now that we have our `Counter` account and `Initialize` type, let's implement
 the `initialize` instruction within `#[program]`. This instruction requires a
 `Context` of type `Initialize` and takes no additional instruction data. In the
-instruction logic, we are simply setting the `counter` account’s `count` field
+instruction logic, we are simply setting the `counter` account's `count` field
 to `0`.
 
 ```rust
@@ -540,14 +540,14 @@ pub fn initialize(ctx: Context) -> Result<()> {
 
 #### 5. Implement `Context` type `Update`
 
-Now, using the `#[derive(Accounts)]` macro again, let’s create the `Update` type
+Now, using the `#[derive(Accounts)]` macro again, let's create the `Update` type
 that lists the accounts that the `increment` instruction requires.
 It'll need the following accounts:
 
 - `counter` - an existing counter account to increment
 - `user` - payer for the transaction fee
 
-Again, we’ll need to specify any constraints using the `#[account(..)]`
+Again, we'll need to specify any constraints using the `#[account(..)]`
 attribute:
 
 ```rust
@@ -561,11 +561,11 @@ pub struct Update<'info> {
 
#### 6. 
Add `increment` instruction -Lastly, within `#[program]`, let’s implement an `increment` instruction to +Lastly, within `#[program]`, let's implement an `increment` instruction to increment the `count` once a `counter` account is initialized by the first instruction. This instruction requires a `Context` of type `Update` (implemented in the next step) and takes no additional instruction data. In the instruction -logic, we are simply incrementing an existing `counter` account’s `count` field +logic, we are simply incrementing an existing `counter` account's `count` field by `1`. ```rust @@ -713,7 +713,7 @@ if you need some more time with it. ## Challenge -Now it’s your turn to build something independently. Because we're starting with +Now it's your turn to build something independently. Because we're starting with simple programs, yours will look almost identical to what we just created. It's useful to try and get to the point where you can write it from scratch without referencing prior code, so try not to copy and paste here. diff --git a/content/courses/program-optimization/program-architecture.md b/content/courses/program-optimization/program-architecture.md index f18289284..52056d9a7 100644 --- a/content/courses/program-optimization/program-architecture.md +++ b/content/courses/program-optimization/program-architecture.md @@ -31,7 +31,7 @@ with the code. And you, as the designer, need to think about: These questions are even more important when developing for a blockchain. Not only are resources more limited than in a typical computing environment, you're -also dealing with people’s assets; code has a cost now. +also dealing with people's assets; code has a cost now. We'll leave most of the asset handling discussion to [security course lesson](/content/courses/program-security/security-intro), but @@ -46,10 +46,10 @@ considerations that should be taken when creating Solana programs. 
### Dealing With Large Accounts -In modern application programming, we don’t often have to think about the size +In modern application programming, we don't often have to think about the size of the data structures we are using. You want to make a string? You can put a 4000 character limit on it if you want to avoid abuse, but it's probably not an -issue. Want an integer? They’re pretty much always 32-bit for convenience. +issue. Want an integer? They're pretty much always 32-bit for convenience. In high level languages, you are in the data-land-o-plenty! Now, in Solana land, we pay per byte stored (rent) and have limits on heap, stack and account sizes. @@ -63,7 +63,7 @@ we are going to be looking at in this section: 2. When operating on larger data, we run into [Stack](https://solana.com/docs/onchain-programs/faq#stack) and [Heap](https://solana.com/docs/onchain-programs/faq#heap-size) constraints - - to get around these, we’ll look at using Box and Zero-Copy. + to get around these, we'll look at using Box and Zero-Copy. #### Sizes @@ -77,10 +77,10 @@ used to be an actual thing, but now there's an enforced minimum rent exemption. You can read about it in [the Solana documentation](https://solana.com/docs/intro/rent). -Rent etymology aside, putting data on the blockchain can be expensive. It’s why +Rent etymology aside, putting data on the blockchain can be expensive. It's why NFT attributes and associated files, like the image, are stored offchain. You ultimately want to strike a balance that leaves your program highly functional -without becoming so expensive that your users don’t want to pay to open the data +without becoming so expensive that your users don't want to pay to open the data account. The first thing you need to know before you can start optimizing for space in @@ -109,7 +109,7 @@ from the Knowing these, start thinking about little optimizations you might take in a program. 
For example, if you have an integer field that will only ever reach -100, don’t use a u64/i64, use a u8. Why? Because a u64 takes up 8 bytes, with a +100, don't use a u64/i64, use a u8. Why? Because a u64 takes up 8 bytes, with a max value of 2^64 or 1.84 \* 10^19. Thats a waste of space since you only need to accommodate numbers up to 100. A single byte will give you a max value of 255 which, in this case, would be sufficient. Similarly, there's no reason to use i8 @@ -127,8 +127,8 @@ If you want to read more about Anchor sizes, take a look at #### Box -Now that you know a little bit about data sizes, let’s skip forward and look at -a problem you’ll run into if you want to deal with larger data accounts. Say you +Now that you know a little bit about data sizes, let's skip forward and look at +a problem you'll run into if you want to deal with larger data accounts. Say you have the following data account: ```rust @@ -144,7 +144,7 @@ pub struct SomeFunctionContext<'info> { ``` If you try to pass `SomeBigDataStruct` into the function with the -`SomeFunctionContext` context, you’ll run into the following compiler warning: +`SomeFunctionContext` context, you'll run into the following compiler warning: `// Stack offset of XXXX exceeded max offset of 4096 by XXXX bytes, please minimize large stack variables` @@ -174,7 +174,7 @@ pub struct SomeFunctionContext<'info> { In Anchor, **`Box`** is used to allocate the account to the Heap, not the Stack. Which is great since the Heap gives us 32KB to work with. The best part -is you don’t have to do anything different within the function. All you need to +is you don't have to do anything different within the function. All you need to do is add `Box<…>` around all of your big data accounts. But Box is not perfect. You can still overflow the stack with sufficiently large @@ -224,7 +224,7 @@ To understand what's happening here, take a look at the > heap size. 
When using borsh, the account has to be copied and deserialized > into a new data structure and thus is constrained by stack and heap limits > imposed by the BPF VM. With zero copy deserialization, all bytes from the -> account’s backing `RefCell<&mut [u8]>` are simply re-interpreted as a +> account's backing `RefCell<&mut [u8]>` are simply re-interpreted as a > reference to the data structure. No allocations or copies necessary. Hence the > ability to get around stack and heap limitations. @@ -244,7 +244,7 @@ pub struct ConceptZeroCopy<'info> { } ``` -Instead, your client has to create the large account and pay for it’s rent in a +Instead, your client has to create the large account and pay for it's rent in a separate instruction. ```typescript @@ -275,12 +275,12 @@ The second caveat is that your'll have to call one of the following methods from inside your rust instruction function to load the account: - `load_init` when first initializing an account (this will ignore the missing - account discriminator that gets added only after the user’s instruction code) + account discriminator that gets added only after the user's instruction code) - `load` when the account is not mutable - `load_mut` when the account is mutable For example, if you wanted to init and manipulate the `SomeReallyBigDataStruct` -from above, you’d call the following in the function +from above, you'd call the following in the function ```rust let some_really_big_data = &mut ctx.accounts.some_really_big_data.load_init()?; @@ -296,7 +296,7 @@ Box and Zero-Copy in vanilla Solana. ### Dealing with Accounts -Now that you know the nuts and bolts of space consideration on Solana, let’s +Now that you know the nuts and bolts of space consideration on Solana, let's look at some higher level considerations. In Solana, everything is an account, so for the next couple sections we'll look at some account architecture concepts. @@ -320,7 +320,7 @@ the location of `id` on the memory map. 
To make this more clear, observe what this account's data looks like onchain when `flags` has four items in the vector vs eight items. If you were to call -`solana account ACCOUNT_KEY` you’d get a data dump like the following: +`solana account ACCOUNT_KEY` you'd get a data dump like the following: ```rust 0000: 74 e4 28 4e d9 ec 31 0a -> Account Discriminator (8) @@ -367,7 +367,7 @@ const states = await program.account.badState.all([ However, if you wanted to query by the `id`, you wouldn't know what to put for the `offset` since the location of `id` is variable based on the length of -`flags`. That doesn’t seem very helpful. IDs are usually there to help with +`flags`. That doesn't seem very helpful. IDs are usually there to help with queries! The simple fix is to flip the order. ```rust @@ -453,7 +453,7 @@ add in some `for_future_use` bytes. #### Data Optimization The idea here is to be aware of wasted bits. For example, if you have a field -that represents the month of the year, don’t use a `u64`. There will only ever +that represents the month of the year, don't use a `u64`. There will only ever be 12 months. Use a `u8`. Better yet, use a `u8` Enum and label the months. To get even more aggressive on bit savings, be careful with booleans. Look at @@ -536,10 +536,10 @@ Depending on the seeding you can create all sorts of relationships: program. For example, if your program needs a lookup table, you could seed it with `seeds=[b"Lookup"]`. Just be careful to provide appropriate access restrictions. -- One-Per-Owner - Say you’re creating a video game player account and you only - want one player account per wallet. Then you’d seed the account with - `seeds=[b"PLAYER", owner.key().as_ref()]`. This way, you’ll always know where - to look for a wallet’s player account **and** there can only ever be one of +- One-Per-Owner - Say you're creating a video game player account and you only + want one player account per wallet. 
Then you'd seed the account with
+  `seeds=[b"PLAYER", owner.key().as_ref()]`. This way, you'll always know where
+  to look for a wallet's player account **and** there can only ever be one of
  them.
- Multiple-Per-Owner - Okay, but what if you want multiple accounts per wallet?
  Say you want to mint podcast episodes. Then you could seed your `Podcast`
@@ -555,8 +555,8 @@ From there you can mix and match in all sorts of clever ways! But the preceding
list should give you enough to get started.

The big benefit of really paying attention to this aspect of design is answering
-the ‘indexing’ problem. Without PDAs and seeds, all users would have to keep
-track of all of the addresses of all of the accounts they’ve ever used. This
+the 'indexing' problem. Without PDAs and seeds, all users would have to keep
+track of all of the addresses of all of the accounts they've ever used. This
isn't feasible for users, so they'd have to depend on a centralized entity to
store their addresses in a database. In many ways that defeats the purpose of a
globally distributed network. PDAs are a much better solution.

@@ -604,11 +604,11 @@ you can avoid concurrency issues and really boost your program's performance.

#### Shared Accounts

-If you’ve been around crypto for a while, you may have experienced a big NFT
+If you've been around crypto for a while, you may have experienced a big NFT
mint event. A new NFT project is coming out, everyone is really excited for it,
-and then the candymachine goes live. It’s a mad dash to click
+and then the candymachine goes live. It's a mad dash to click
`accept transaction` as fast as you can. If you were clever, you may have
-written a bot to enter in the transactions faster that the website’s UI could.
+written a bot to enter in the transactions faster than the website's UI could.

This mad rush to mint creates a lot of failed transactions. But why? Because
everyone is trying to write data to the same Candy Machine account. 
@@ -634,7 +634,7 @@ Bob -- pays --- |
 ```
 
 Since both of these transactions write to Carol's token account, only one of
-them can go through at a time. Fortunately, Solana is wicked fast, so it’ll
+them can go through at a time. Fortunately, Solana is wicked fast, so it'll
 probably seem like they get paid at the same time. But what happens if more
 than just Alice and Bob try to pay Carol?
@@ -658,7 +658,7 @@ trying to write data to the same account all at once.
 Imagine you create a super popular program and you want to take a fee on every
 transaction you process. For accounting reasons, you want all of those fees to
 go to one wallet. With that setup, on a surge of users, your protocol will
-become slow and or become unreliable. Not great. So what’s the solution?
+become slow and/or unreliable. Not great. So what's the solution?
 Separate the data transaction from the fee transaction.
 
 For example, imagine you have a data account called `DonationTally`. Its only
@@ -675,7 +675,7 @@ pub struct DonationTally {
 }
 ```
 
-First let’s look at the suboptimal solution.
+First let's look at the suboptimal solution.
 
 ```rust
 pub fn run_concept_shared_account_bottleneck(ctx: Context, lamports_to_donate: u64) -> Result<()> {
@@ -707,7 +707,7 @@ pub fn run_concept_shared_account_bottleneck(ctx: Context) -> Result<()> {
 
 #### 8. Attack Monster
 
-Now! Let’s attack those monsters and start gaining some exp!
+Now! Let's attack those monsters and start gaining some exp!
 
 The logic here is as follows:
@@ -1273,10 +1273,10 @@ incrementing experience and kill counts.
 
 The `saturating_add` function ensures the number will never overflow. Say the
 `kills` was a u8 and my current kill count was 255 (0xFF). If I killed another
 and added normally, e.g. `255 + 1 = 0 (0xFF + 0x01 = 0x00) = 0`, the kill count
-would end up as 0. `saturating_add` will keep it at its max if it’s about to
+would end up as 0. `saturating_add` will keep it at its max if it's about to
 roll over, so `255 + 1 = 255`.
The `checked_add` function will throw an error if
-it’s about to overflow. Keep this in mind when doing math in Rust. Even though
-`kills` is a u64 and will never roll with it’s current programming, it’s good
+it's about to overflow. Keep this in mind when doing math in Rust. Even though
+`kills` is a u64 and will never roll over with its current programming, it's
 practice to use safe math and consider roll-overs.
 
 ```rust
@@ -1452,9 +1452,9 @@ anchor build
 
 #### Testing
 
-Now, let’s see this baby work!
+Now, let's see this baby work!
 
-Let’s set up the `tests/rpg.ts` file. We will be filling out each test in turn.
+Let's set up the `tests/rpg.ts` file. We will be filling out each test in turn.
 But first, we needed to set up a couple of different accounts. Mainly the
 `gameMaster` and the `treasury`.
 
@@ -1531,7 +1531,7 @@ anchor test
 some `.pnp.*` files and no `node_modules`, you may want to call `rm -rf .pnp.*`
 followed by `npm i` and then `yarn install`. That should work.
 
-Now that everything is running, let’s implement the `Create Player`,
+Now that everything is running, let's implement the `Create Player`,
 `Spawn Monster`, and `Attack Monster` tests. Run each test as you complete them
 to make sure things are running smoothly.
@@ -1753,7 +1753,7 @@ optimization adds up!
 
 ## Challenge
 
-Now it’s your turn to practice independently. Go back through the lab code
+Now it's your turn to practice independently. Go back through the lab code
 looking for additional optimizations and/or expansion you can make. Think
 through new systems and features you would add and how you would optimize them.
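The distinction between wrapping, saturating, and checked arithmetic is easy to verify in a stand-alone Rust sketch. Note that `checked_add` signals overflow by returning `None` rather than throwing — calling code (such as an Anchor instruction handler) is expected to convert that into an error:

```rust
fn main() {
    let kills: u8 = 255; // u8 for demonstration; the lab's field is a u64

    // saturating_add clamps at the type's maximum instead of wrapping to 0.
    assert_eq!(kills.saturating_add(1), 255);

    // checked_add returns None on overflow, letting you surface an error
    // instead of silently wrapping.
    assert_eq!(kills.checked_add(1), None);
    assert_eq!(200u8.checked_add(55), Some(255));

    // wrapping_add shows what unchecked wrap-around looks like: 255 + 1 = 0.
    // (A plain `+` would panic in debug builds and wrap in release builds.)
    assert_eq!(kills.wrapping_add(1), 0);
}
```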
diff --git a/content/courses/program-optimization/program-configuration.md b/content/courses/program-optimization/program-configuration.md index ac844a33c..58760c4b0 100644 --- a/content/courses/program-optimization/program-configuration.md +++ b/content/courses/program-optimization/program-configuration.md @@ -27,7 +27,7 @@ description: [`cfg!` **macro**](https://doc.rust-lang.org/std/macro.cfg.html) to compile different code paths based on the enabled features. - For environment-like variables post-deployment, create program accounts and - admin-only instructions accessible by the program’s upgrade authority. + admin-only instructions accessible by the program's upgrade authority. ## Lesson @@ -59,7 +59,7 @@ effective solution involves a combination of two techniques: ### Native Rust Feature Flags One of the simplest ways to create environments is to use Rust features. -Features are defined in the `[features]` table of the program’s `Cargo.toml` +Features are defined in the `[features]` table of the program's `Cargo.toml` file. You may define multiple features for different use cases. ```toml diff --git a/content/courses/program-optimization/rust-macros.md b/content/courses/program-optimization/rust-macros.md index 37dd55142..6f9e106a1 100644 --- a/content/courses/program-optimization/rust-macros.md +++ b/content/courses/program-optimization/rust-macros.md @@ -798,9 +798,9 @@ pub struct Config { ### 3. Define the custom macro Now, in the `custom-macro/src/lib.rs` file, let's add our new macro's -declaration. In this file, we’ll use the `parse_macro_input!` macro to parse the +declaration. In this file, we'll use the `parse_macro_input!` macro to parse the input `TokenStream` and extract the `ident` and `data` fields from a -`DeriveInput` struct. Then, we’ll use the `eprintln!` macro to print the values +`DeriveInput` struct. Then, we'll use the `eprintln!` macro to print the values of `ident` and `data`. 
We will now use `TokenStream::new()` to return an empty `TokenStream`. @@ -834,7 +834,7 @@ input `TokenStream` parses correctly, remove the `eprintln!` statements. ### 4. Get the struct's fields -Next, let’s use `match` statements to get the named fields from the `data` of +Next, let's use `match` statements to get the named fields from the `data` of the struct. Then we'll use the `eprintln!` macro to print the values of the fields. @@ -867,7 +867,7 @@ correctly, you can remove the `eprintln!` statement. ### 5. Build update instructions -Next, let’s iterate over the fields of the struct and generate an update +Next, let's iterate over the fields of the struct and generate an update instruction for each field. The instruction will be generated using the `quote!` macro, including the field's name and type and a new function name for the update instruction. @@ -909,7 +909,7 @@ pub fn instruction_builder(input: TokenStream) -> TokenStream { ### 6. Return new `TokenStream` -Lastly, let’s use the `quote!` macro to generate an implementation for the +Lastly, let's use the `quote!` macro to generate an implementation for the struct with the name specified by the `ident` variable. The implementation includes the update instructions generated for each field in the struct. The generated code is then converted to a `TokenStream` using the `into()` method diff --git a/content/courses/program-security/account-data-matching.md b/content/courses/program-security/account-data-matching.md index f9963a123..baff3b5bf 100644 --- a/content/courses/program-security/account-data-matching.md +++ b/content/courses/program-security/account-data-matching.md @@ -80,7 +80,7 @@ pub struct AdminConfig { The basic Rust approach to solve this problem is to simply compare the passed in `admin` key to the `admin` key stored in the `admin_config` account, throwing an -error if they don’t match. +error if they don't match. 
```rust if ctx.accounts.admin.key() != ctx.accounts.admin_config.admin { @@ -172,7 +172,7 @@ pub struct AdminConfig { Alternatively, you can use `constraint` to manually add an expression that must evaluate to true in order for execution to continue. This is useful when for -some reason naming can’t be consistent or when you need a more complex +some reason naming can't be consistent or when you need a more complex expression to fully validate the incoming data. ```rust @@ -191,9 +191,9 @@ pub struct UpdateAdmin<'info> { ## Lab -For this lab we’ll create a simple “vault” program similar to the program we +For this lab we'll create a simple “vault” program similar to the program we used in the Signer Authorization lesson and the Owner Check lesson. Similar to -those labs, we’ll show in this lab how a missing data validation check could +those labs, we'll show in this lab how a missing data validation check could allow the vault to be drained. #### 1. Starter @@ -212,7 +212,7 @@ program. This allows the `vault` account to sign for the transfer of tokens from the token account. The `insecure_withdraw` instruction transfers all the tokens in the `vault` -account’s token account to a `withdraw_destination` token account. +account's token account to a `withdraw_destination` token account. Notice that this instruction \***\*does\*\*** have a signer check for `authority` and an owner check for `vault`. However, nowhere in the account @@ -315,8 +315,8 @@ pub struct Vault { #### 2. Test `insecure_withdraw` instruction -To prove that this is a problem, let’s write a test where an account other than -the vault’s `authority` tries to withdraw from the vault. +To prove that this is a problem, let's write a test where an account other than +the vault's `authority` tries to withdraw from the vault. 
The test file includes the code to invoke the `initialize_vault` instruction using the provider wallet as the `authority` and then mints 100 tokens to the @@ -363,14 +363,14 @@ account-data-matching #### 3. Add `secure_withdraw` instruction -Let’s go implement a secure version of this instruction called +Let's go implement a secure version of this instruction called `secure_withdraw`. This instruction will be identical to the `insecure_withdraw` instruction, -except we’ll use the `has_one` constraint in the account validation struct +except we'll use the `has_one` constraint in the account validation struct (`SecureWithdraw`) to check that the `authority` account passed into the instruction matches the `authority` account on the `vault` account. That way -only the correct authority account can withdraw the vault’s tokens. +only the correct authority account can withdraw the vault's tokens. ```rust use anchor_lang::prelude::*; @@ -429,7 +429,7 @@ pub struct SecureWithdraw<'info> { #### 4. Test `secure_withdraw` instruction -Now let’s test the `secure_withdraw` instruction with two tests: one that uses +Now let's test the `secure_withdraw` instruction with two tests: one that uses `walletFake` as the authority and one that uses `wallet` as the authority. We expect the first invocation to return an error and the second to succeed. 
diff --git a/content/courses/program-security/arbitrary-cpi.md b/content/courses/program-security/arbitrary-cpi.md index b5302e7d8..edb15793d 100644 --- a/content/courses/program-security/arbitrary-cpi.md +++ b/content/courses/program-security/arbitrary-cpi.md @@ -3,7 +3,7 @@ title: Arbitrary CPI objectives: - Explain the security risks associated with invoking a CPI to an unknown program - - Showcase how Anchor’s CPI module prevents this from happening when making a + - Showcase how Anchor's CPI module prevents this from happening when making a CPI from one Anchor program to another - Safely and securely make a CPI from an Anchor program to an arbitrary non-anchor program @@ -35,7 +35,7 @@ program results in your program performing CPIs to arbitrary programs. This lack of program checks creates an opportunity for a malicious user to pass in a different program than expected, causing the original program to call an -instruction handler on this mystery program. There’s no telling what the +instruction handler on this mystery program. There's no telling what the consequences of this CPI could be. It depends on the program logic (both that of the original program and the unexpected program), as well as what other accounts are passed into the original instruction handler. @@ -120,8 +120,8 @@ pub fn cpi_secure(ctx: Context, amount: u64) -> ProgramResult { Now, if an attacker passes in a different token program, the instruction handler will return the `ProgramError::IncorrectProgramId` error. -Depending on the program you’re invoking with your CPI, you can either hard code -the address of the expected program ID or use the program’s Rust crate to get +Depending on the program you're invoking with your CPI, you can either hard code +the address of the expected program ID or use the program's Rust crate to get the address of the program, if available. In the example above, the `spl_token` crate provides the address of the SPL Token Program. 
@@ -133,7 +133,7 @@ learned in a [previous lesson of Anchor CPI](/content/courses/onchain-development/anchor-cpi.md) that Anchor can automatically generate CPI modules to make CPIs into the program simpler. These modules also enhance security by verifying the public key of the -program that’s passed into one of its public instructions. +program that's passed into one of its public instructions. Every Anchor program uses the `declare_id()` macro to define the address of the program. When a CPI module is generated for a specific program, it uses the @@ -190,8 +190,8 @@ Like the example above, Anchor has created a few that allow you to issue CPIs into them as if they were Anchor programs. -Additionally and depending on the program you’re making the CPI to, you may be -able to use Anchor’s +Additionally and depending on the program you're making the CPI to, you may be +able to use Anchor's [`Program` account type](https://docs.rs/anchor-lang/latest/anchor_lang/accounts/program/struct.Program.html) to validate the passed-in program in your account validation struct. 
Between the [`anchor_lang`](https://docs.rs/anchor-lang/latest/anchor_lang) and [`anchor_spl`](https://docs.rs/anchor_spl/latest/) crates, diff --git a/content/courses/program-security/bump-seed-canonicalization.md b/content/courses/program-security/bump-seed-canonicalization.md index e87e288fe..e2ea4d93f 100644 --- a/content/courses/program-security/bump-seed-canonicalization.md +++ b/content/courses/program-security/bump-seed-canonicalization.md @@ -3,7 +3,7 @@ title: Bump Seed Canonicalization objectives: - Explain the vulnerabilities associated with using PDAs derived without the canonical bump - - Initialize a PDA using Anchor’s `seeds` and `bump` constraints to + - Initialize a PDA using Anchor's `seeds` and `bump` constraints to automatically use the canonical bump - Use Anchor's `seeds` and `bump` constraints to ensure the canonical bump is always used in future instructions when deriving a PDA @@ -30,7 +30,7 @@ description: - Anchor allows you to **specify a bump** with the `bump = ` constraint when verifying the address of a PDA - Because `find_program_address` can be expensive, best practice is to store the - derived bump in an account’s data field to be referenced later on when + derived bump in an account's data field to be referenced later on when re-deriving the address for verification ```rust #[derive(Accounts)] @@ -151,12 +151,12 @@ pub fn set_value_secure( } ``` -### Use Anchor’s `seeds` and `bump` constraints +### Use Anchor's `seeds` and `bump` constraints Anchor provides a convenient way to derive PDAs in the account validation struct using the `seeds` and `bump` constraints. These can even be combined with the `init` constraint to initialize the account at the intended address. 
To protect -the program from the vulnerability we’ve been discussing throughout this lesson, +the program from the vulnerability we've been discussing throughout this lesson, Anchor does not even allow you to initialize an account at a PDA using anything but the canonical bump. Instead, it uses `find_program_address` to derive the PDA and subsequently performs the initialization. @@ -280,7 +280,7 @@ If you don't specify the bump on the `bump` constraint, Anchor will still use `find_program_address` to derive the PDA using the canonical bump. As a consequence, your instruction will incur a variable amount of compute budget. Programs that are already at risk of exceeding their compute budget should use -this with care since there is a chance that the program’s budget may be +this with care since there is a chance that the program's budget may be occasionally and unpredictably exceeded. On the other hand, if you only need to verify the address of a PDA passed in diff --git a/content/courses/program-security/closing-accounts.md b/content/courses/program-security/closing-accounts.md index 2f62f0c9c..d4a9b28de 100644 --- a/content/courses/program-security/closing-accounts.md +++ b/content/courses/program-security/closing-accounts.md @@ -33,7 +33,7 @@ While it sounds simple, closing accounts properly can be tricky. There are a number of ways an attacker could circumvent having the account closed if you don't follow specific steps. -To get a better understanding of these attack vectors, let’s explore each of +To get a better understanding of these attack vectors, let's explore each of these scenarios in depth. ### Insecure account closing @@ -45,10 +45,10 @@ account. This resets the owner from the owning program to the system program. Take a look at the example below. The instruction requires two accounts: 1. `account_to_close` - the account to be closed -2. `destination` - the account that should receive the closed account’s lamports +2. 
`destination` - the account that should receive the closed account's lamports The program logic is intended to close an account by simply increasing the -`destination` account’s lamports by the amount stored in the `account_to_close` +`destination` account's lamports by the amount stored in the `account_to_close` and setting the `account_to_close` lamports to 0. With this program, after a full transaction is processed, the `account_to_close` will be garbage collected by the runtime. @@ -238,7 +238,7 @@ Fortunately, Anchor makes all of this much simpler with the `#[account(close = )]` constraint. This constraint handles everything required to securely close an account: -1. Transfers the account’s lamports to the given `` +1. Transfers the account's lamports to the given `` 2. Zeroes out the account data 3. Sets the account discriminator to the `CLOSED_ACCOUNT_DISCRIMINATOR` variant @@ -258,8 +258,8 @@ pub struct CloseAccount { } ``` -The `force_defund` instruction is an optional addition that you’ll have to -implement on your own if you’d like to utilize it. +The `force_defund` instruction is an optional addition that you'll have to +implement on your own if you'd like to utilize it. ## Lab diff --git a/content/courses/program-security/duplicate-mutable-accounts.md b/content/courses/program-security/duplicate-mutable-accounts.md index c82a49f09..b52f1e29e 100644 --- a/content/courses/program-security/duplicate-mutable-accounts.md +++ b/content/courses/program-security/duplicate-mutable-accounts.md @@ -172,7 +172,7 @@ pub struct User { ## Lab -Let’s practice by creating a simple Rock Paper Scissors program to demonstrate +Let's practice by creating a simple Rock Paper Scissors program to demonstrate how failing to check for duplicate mutable accounts can cause undefined behavior within your program. 
@@ -356,9 +356,9 @@ pub struct RockPaperScissorsSecure<'info> { ### Test rock_paper_scissors_shoot_secure instruction -To test the `rock_paper_scissors_shoot_secure` instruction, we’ll invoke the -instruction twice. First, we’ll invoke the instruction using two different -player accounts to check that the instruction works as intended. Then, we’ll +To test the `rock_paper_scissors_shoot_secure` instruction, we'll invoke the +instruction twice. First, we'll invoke the instruction using two different +player accounts to check that the instruction works as intended. Then, we'll invoke the instruction using the `playerOne.publicKey` as both player accounts, which we expect to fail. diff --git a/content/courses/program-security/owner-checks.md b/content/courses/program-security/owner-checks.md index a30103dc8..99466ac52 100644 --- a/content/courses/program-security/owner-checks.md +++ b/content/courses/program-security/owner-checks.md @@ -645,7 +645,7 @@ debugging. Ensuring account ownership checks is critical to avoid security vulnerabilities. This example demonstrates how simple it is to implement proper validation, but -it’s vital to always verify which accounts are owned by specific programs. +it's vital to always verify which accounts are owned by specific programs. If you'd like to review the final solution code, it's available on the [`solution` branch of the repository](https://github.com/solana-developers/owner-checks/tree/solution). 
diff --git a/content/courses/program-security/pda-sharing.md b/content/courses/program-security/pda-sharing.md index a57bf4a95..7b70de4a7 100644 --- a/content/courses/program-security/pda-sharing.md +++ b/content/courses/program-security/pda-sharing.md @@ -3,7 +3,7 @@ title: PDA Sharing objectives: - Explain the security risks associated with PDA sharing - Derive PDAs that have discrete authority domains - - Use Anchor’s `seeds` and `bump` constraints to validate PDA accounts + - Use Anchor's `seeds` and `bump` constraints to validate PDA accounts description: "Understand the potential problems of reusing PDAs by using user and domain specific PDAs." @@ -15,7 +15,7 @@ description: possibility of users accessing data and funds that don't belong to them - Prevent the same PDA from being used for multiple accounts by using seeds that are user and/or domain-specific -- Use Anchor’s `seeds` and `bump` constraints to validate that a PDA is derived +- Use Anchor's `seeds` and `bump` constraints to validate that a PDA is derived using the expected seeds and bump ## Lesson @@ -155,7 +155,7 @@ pub struct TokenPool { } ``` -### Anchor’s seeds and bump Constraints +### Anchor's seeds and bump Constraints PDAs can be used as both the address of an account and allow programs to sign for the PDAs they own. @@ -165,7 +165,7 @@ the address of the `pool` account and the owner of the `vault` token account. This means that only the `pool` account associated with the correct `vault` and `withdraw_destination` can be used in the `withdraw_tokens` instruction handler. -You can use Anchor’s `seeds` and `bump` constraints with the +You can use Anchor's `seeds` and `bump` constraints with the [`#[account(...)]`](https://www.anchor-lang.com/docs/account-constraints) attribute to validate the `pool` account PDA. 
Anchor derives a PDA using the `seeds` and `bump` specified and compares it against the account passed into the @@ -234,8 +234,8 @@ pub struct TokenPool { ## Lab -Let’s practice by creating a simple program to demonstrate how PDA sharing can -allow an attacker to withdraw tokens that don’t belong to them. This lab expands +Let's practice by creating a simple program to demonstrate how PDA sharing can +allow an attacker to withdraw tokens that don't belong to them. This lab expands on the examples above by including the instruction handlers to initialize the required program accounts. @@ -517,7 +517,7 @@ it("prevents secure withdrawal to incorrect destination", async () => { ``` Lastly, since the `pool` account is a PDA derived using the -`withdraw_destination` token account, we can’t create a fake `pool` account +`withdraw_destination` token account, we can't create a fake `pool` account using the same PDA. Add one more test showing that the new `initialize_pool_secure` instruction handler won't let an attacker put in the wrong vault. diff --git a/content/courses/program-security/security-intro.md b/content/courses/program-security/security-intro.md index 468be61db..f3a17c022 100644 --- a/content/courses/program-security/security-intro.md +++ b/content/courses/program-security/security-intro.md @@ -10,7 +10,7 @@ description: ## Overview This course aims to introduce you to a range of common security exploits unique -to Solana development. We’ve modeled this course heavily on Coral's +to Solana development. We've modeled this course heavily on Coral's [Sealevel Attacks](https://github.com/coral-xyz/sealevel-attacks) repository. 
Program security is covered in our @@ -32,7 +32,7 @@ While the first few lessons in this course cover topics similar to those in the [Anchor course](/content/courses/onchain-development/intro-to-anchor.md) or [Program Security lesson](/content/courses/native-onchain-development/program-security.md) in the [Native Course](/content/courses/native-onchain-development.md), but as -you progress, you’ll encounter new types of attacks. We encourage you to explore +you progress, you'll encounter new types of attacks. We encourage you to explore all of them. diff --git a/content/courses/program-security/signer-auth.md b/content/courses/program-security/signer-auth.md index 897291264..34777b07f 100644 --- a/content/courses/program-security/signer-auth.md +++ b/content/courses/program-security/signer-auth.md @@ -3,8 +3,8 @@ title: Signer Authorization objectives: - Explain the security risks of not performing appropriate signer checks. - Implement signer checks using native Rust - - Implement signer checks using Anchor’s `Signer` type - - Implement signer checks using Anchor’s `#[account(signer)]` constraint + - Implement signer checks using Anchor's `Signer` type + - Implement signer checks using Anchor's `#[account(signer)]` constraint description: "Ensure instructions are only executed by authorized accounts by implementing signer checks." @@ -52,7 +52,7 @@ the instruction handler matches the `authority` field on the `vault` account, there is no verification that the `authority` account actually authorized the transaction. -This omission allows an attacker to pass in the `authority` account’s public key +This omission allows an attacker to pass in the `authority` account's public key and their own public key as the `new_authority` account, effectively reassigning themselves as the new authority of the `vault` account. Once they have control, they can interact with the program as the new authority. 
@@ -146,7 +146,7 @@ pub struct Vault { } ``` -### Use Anchor’s Signer Account Type +### Use Anchor's Signer Account Type Incorporating the [`signer`](https://docs.rs/anchor-lang/latest/anchor_lang/accounts/signer/struct.Signer.html) @@ -195,7 +195,7 @@ When you use the `Signer` type, no other ownership or type checks are performed. -### Using Anchor’s `#[account(signer)]` Constraint +### Using Anchor's `#[account(signer)]` Constraint While the `Signer` account type is useful, it doesn't perform other ownership or type checks, limiting its use in instruction handler logic. The @@ -277,7 +277,7 @@ authority. The `vault` PDA will be the authority of the token account, enabling the program to sign off on token transfers. The `insecure_withdraw` instruction handler transfers tokens from the `vault` -account’s token account to a `withdraw_destination` token account. However, the +account's token account to a `withdraw_destination` token account. However, the `authority` account in the `InsecureWithdraw` struct is of type `UncheckedAccount`, a wrapper around `AccountInfo` that explicitly indicates the account is unchecked. diff --git a/content/courses/solana-pay/solana-pay.md b/content/courses/solana-pay/solana-pay.md index 7cd0baebe..b8b8a0cf9 100644 --- a/content/courses/solana-pay/solana-pay.md +++ b/content/courses/solana-pay/solana-pay.md @@ -603,7 +603,7 @@ async function buildTransaction( #### 6. Implement the `buildTransaction` function -Next, let’s implement the `buildTransaction` function. It should build, +Next, let's implement the `buildTransaction` function. It should build, partially sign, and return the check-in transaction. The sequence of items it needs to perform is: @@ -743,7 +743,7 @@ async function fetchUserState(account: PublicKey): Promise { #### 8. Implement `verifyCorrectLocation` function -Next, let’s implement the `verifyCorrectLocation` helper function. This function +Next, let's implement the `verifyCorrectLocation` helper function. 
This function is used to verify that a user is at the correct location in a scavenger hunt game. diff --git a/content/courses/state-compression/compressed-nfts.md b/content/courses/state-compression/compressed-nfts.md index 461bec325..e0a7ba59d 100644 --- a/content/courses/state-compression/compressed-nfts.md +++ b/content/courses/state-compression/compressed-nfts.md @@ -1,7 +1,7 @@ --- title: Compressed NFTs objectives: - - Create a compressed NFT collection using Metaplex’s Bubblegum program + - Create a compressed NFT collection using Metaplex's Bubblegum program - Mint compressed NFTs using the Bubblegum TS SDK - Transfer compressed NFTs using the Bubblegum TS SDK - Read compressed NFT data using the Read API @@ -15,8 +15,8 @@ description: - **Compressed NFTs (cNFTs)** use **State Compression** to hash NFT data and store the hash onchain in an account using a **concurrent Merkle tree** structure -- The cNFT data hash can’t be used to infer the cNFT data, but it can be used to - **verify** if the cNFT data you’re seeing is correct +- The cNFT data hash can't be used to infer the cNFT data, but it can be used to + **verify** if the cNFT data you're seeing is correct - Supporting RPC providers **index** cNFT data offchain when the cNFT is minted so that you can use the **Read API** to access the data - The **Metaplex Bubblegum program** is an abstraction on top of the **State @@ -30,7 +30,7 @@ structure takes up less account storage than traditional NFTs. Compressed NFTs leverage a concept called **State Compression** to store data in a way that drastically reduces costs. -Solana’s transaction costs are so cheap that most users never think about how +Solana's transaction costs are so cheap that most users never think about how expensive minting NFTs can be at scale. The cost to set up and mint 1 million traditional NFTs is approximately 24,000 SOL. By comparison, cNFTs can be structured to where the same setup and mint costs 10 SOL or less. 
That means @@ -40,14 +40,14 @@ over traditional NFTs. However, cNFTs can be tricky to work with. Eventually, the tooling required to work with them will be sufficiently abstracted from the underlying technology that the developer experience between traditional NFTs and cNFTs will be -negligible. But for now, you’ll still need to understand the low level puzzle -pieces, so let’s dig in! +negligible. But for now, you'll still need to understand the low level puzzle +pieces, so let's dig in! ### A theoretical overview of cNFTs Most of the costs associated with traditional NFTs come down to account storage space. Compressed NFTs use a concept called State Compression to store data in -the blockchain’s cheaper **ledger state**, using more expensive account space +the blockchain's cheaper **ledger state**, using more expensive account space only to store a “fingerprint”, or **hash**, of the data. This hash allows you to cryptographically verify that data has not been tampered with. @@ -71,20 +71,20 @@ are: truth” can go through the same process and compare the final hash without having to store all the data onchain -One problem not addressed in the above is how to make data available if it can’t +One problem not addressed in the above is how to make data available if it can't be fetched from an account. Since this hashing process occurs onchain, all the data exists in the ledger state and could theoretically be retrieved from the original transaction by replaying the entire chain state from origin. However, -it’s much more straightforward (though still complicated) to have an **indexer** +it's much more straightforward (though still complicated) to have an **indexer** track and index this data as the transactions occur. This ensures there is an offchain “cache” of the data that anyone can access and subsequently verify against the onchain root hash. -This process is _very complex_. 
We’ll cover some of the key concepts below but -don’t worry if you don’t understand it right away. We’ll talk more theory in the +This process is _very complex_. We'll cover some of the key concepts below but +don't worry if you don't understand it right away. We'll talk more theory in the state compression lesson and focus primarily on application to NFTs in this -lesson. You’ll be able to work with cNFTs by the end of this lesson even if you -don’t fully understand every piece of the state compression puzzle. +lesson. You'll be able to work with cNFTs by the end of this lesson even if you +don't fully understand every piece of the state compression puzzle. #### Concurrent Merkle trees @@ -168,7 +168,7 @@ forever exist on the ledger state. #### Index data for easy lookup Under normal conditions, you would typically access onchain data by fetching the -appropriate account. When using state compression, however, it’s not so +appropriate account. When using state compression, however, it's not so straightforward. As mentioned above, the data now exists in the ledger state rather than in an @@ -177,37 +177,37 @@ instruction, but while this data will in a sense exist in the ledger state forever, it will likely be inaccessible through validators after a certain period of time. -To save space and be more performant, validators don’t retain every transaction -back to the genesis block. The specific amount of time you’ll be able to access +To save space and be more performant, validators don't retain every transaction +back to the genesis block. The specific amount of time you'll be able to access the Noop instruction logs related to your data will vary based on the validator, -but eventually you’ll lose access to it if you’re relying directly on +but eventually you'll lose access to it if you're relying directly on instruction logs. 
Technically, you _can_ replay transaction state back to the genesis block but
-the average team isn’t going to do that, and it certainly won’t be performant.
+the average team isn't going to do that, and it certainly won't be performant.
 Instead, you should use an indexer that will observe the events sent to the Noop
-program and store the relevant data off chain. That way you don’t need to worry
+program and store the relevant data off chain. That way you don't need to worry
 about old data becoming inaccessible.
 
 ### Create a cNFT Collection
 
-With the theoretical background out of the way, let’s turn our attention to the
+With the theoretical background out of the way, let's turn our attention to the
 main point of this lesson: how to create a cNFT collection.
 
 Fortunately, you can use tools created by Solana Foundation, the Solana
-developer community, and Metaplex to simplify the process. Specifically, we’ll
+developer community, and Metaplex to simplify the process. Specifically, we'll
 be using the `@solana/spl-account-compression` SDK, the Metaplex Bubblegum
-program, and the Bubblegum program’s corresponding TS SDK
+program, and the Bubblegum program's corresponding TS SDK
 `@metaplex-foundation/mpl-bubblegum`.
 
 #### Prepare metadata
 
-Prior to starting, you’ll prepare your NFT metadata similarly to how you would
+Prior to starting, you'll prepare your NFT metadata similarly to how you would
 if you were using a Candy Machine. At its core, an NFT is simply a token with
 metadata that follows the NFT standard. In other words, it should be shaped
 something like this:
@@ -237,13 +237,13 @@ something like this:
 ```
 
 Depending on your use case, you may be able to generate this dynamically or you
-might want to have a JSON file prepared for each cNFT beforehand. You’ll also
+might want to have a JSON file prepared for each cNFT beforehand. You'll also
 need any other assets referenced by the JSON, such as the `image` url shown in
 the example above.
#### Create Collection NFT -If you want your cNFTs to be part of a collection, you’ll need to create a +If you want your cNFTs to be part of a collection, you'll need to create a Collection NFT **before** you start minting cNFTs. This is a traditional NFT that acts as the reference binding your cNFTs together into a single collection. You can create this NFT using the `@metaplex-foundation/js` library. Just make @@ -315,19 +315,19 @@ the max depth, while the buffer size will determine the number of concurrent changes (mints, transfers, etc.) within the same slot that can occur to the tree. In other words, choose the max depth that corresponds to the number of NFTs you need the tree to hold, then choose one of the options for max buffer -size based on the traffic you expect you’ll need to support. +size based on the traffic you expect you'll need to support. Next, choose the canopy depth. Increasing the canopy depth increases the -composability of your cNFTs. Any time your or another developer’s code attempts +composability of your cNFTs. Any time your or another developer's code attempts to verify a cNFT down the road, the code will have to pass in as many proof -nodes as there are “layers” in your tree. So for a max depth of 20, you’ll need +nodes as there are “layers” in your tree. So for a max depth of 20, you'll need to pass in 20 proof nodes. Not only is this tedious, but since each proof node -is 32 bytes it’s possible to max out transaction sizes very quickly. +is 32 bytes it's possible to max out transaction sizes very quickly. For example, if your tree has a very low canopy depth, an NFT marketplace may only be able to support simple NFTs transfers rather than support an onchain bidding system for your cNFTs. The canopy effectively caches proof nodes onchain -so you don’t have to pass all of them into the transaction, allowing for more +so you don't have to pass all of them into the transaction, allowing for more complex transactions. 
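As a back-of-the-envelope sketch of these trade-offs — assuming only that each added tree layer doubles capacity and that each proof node is a 32-byte hash, as described above:

```typescript
// Minimum max depth for a tree that must hold `capacity` leaves:
// each additional layer doubles the number of available leaves.
function requiredMaxDepth(capacity: number): number {
  return Math.ceil(Math.log2(capacity));
}

// Proof nodes the client must still pass in after the canopy caches the
// top `canopyDepth` layers onchain, expressed in bytes (32 bytes per node).
function proofBytesRequired(maxDepth: number, canopyDepth: number): number {
  return Math.max(maxDepth - canopyDepth, 0) * 32;
}
```

For example, holding one million cNFTs needs a max depth of 20; with no canopy each proof costs 640 bytes of transaction space, while a canopy depth of 10 cuts that to 320 bytes.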
 Increasing any of these three values increases the size of the account, thereby
@@ -354,7 +354,7 @@ const allocTreeIx = await createAllocTreeIx(
 
 Note that this is simply a helper function for calculating the size required by
 the account and creating the instruction to send to the System Program for
-allocating the account. This function doesn’t interact with any
+allocating the account. This function doesn't interact with any
 compression-specific programs yet.
 
 #### Use Bubblegum to Initialize Your Tree
 
@@ -365,8 +365,8 @@ tree config account to add cNFT-specific tracking and functionality.
 
 Version 0.7 of the `@metaplex-foundation/mpl-bubblegum` TS SDK provides the
 helper function `createCreateTreeInstruction` for calling the `create_tree`
-instruction on the Bubblegum program. As part of the call, you’ll need to derive
-the `treeAuthority` PDA expected by the program. This PDA uses the tree’s
+instruction on the Bubblegum program. As part of the call, you'll need to derive
+the `treeAuthority` PDA expected by the program. This PDA uses the tree's
 address as a seed.
 
 ```typescript
@@ -442,7 +442,7 @@ Feel free to take a look at the program code
 
 #### Mint cNFTs
 
 With the Merkle tree account and its corresponding Bubblegum tree config account
-initialized, it’s possible to mint cNFTs to the tree. The Bubblegum instruction
+initialized, it's possible to mint cNFTs to the tree. The Bubblegum instruction
 to use will be either `mint_v1` or `mint_to_collection_v1`, depending on whether
 or not you want the minted cNFT to be part of a collection.
 
@@ -537,7 +537,7 @@ const mintWithoutCollectionIx = createMintV1Instruction(
 
 ### Interact with cNFTs
 
-It’s important to note that cNFTs _are not_ SPL tokens. That means your code
+It's important to note that cNFTs _are not_ SPL tokens. That means your code
 needs to follow different conventions to handle cNFT functionality like
 fetching, querying, transferring, etc.
The simplest way to fetch data from an existing cNFT is to use the [Digital Asset Standard Read API](https://solana.com/developers/guides/javascript/compressed-nfts#reading-compressed-nfts-metadata) (Read API). Note that this is separate from the standard JSON RPC. To use the -Read API, you’ll need to use a supporting RPC Provider. Metaplex maintains a +Read API, you'll need to use a supporting RPC Provider. Metaplex maintains a (likely non-exhaustive) [list of RPC providers](https://developers.metaplex.com/bubblegum/rpcs) that -support the Read API. In this lesson we’ll be using +support the Read API. In this lesson we'll be using [Helius](https://docs.helius.dev/compression-and-das-api/digital-asset-standard-das-api) as they have free support for Devnet. -To use the Read API to fetch a specific cNFT, you need to have the cNFT’s asset -ID. However, after minting cNFTs, you’ll have at most two pieces of information: +To use the Read API to fetch a specific cNFT, you need to have the cNFT's asset +ID. However, after minting cNFTs, you'll have at most two pieces of information: 1. The transaction signature 2. The leaf index (possibly) -The only real guarantee is that you’ll have the transaction signature. It is +The only real guarantee is that you'll have the transaction signature. It is **possible** to locate the leaf index from there, but it involves some fairly complex parsing. The short story is you must retrieve the relevant instruction -logs from the Noop program and parse them to find the leaf index. We’ll cover -this more in depth in a future lesson. For now, we’ll assume you know the leaf +logs from the Noop program and parse them to find the leaf index. We'll cover +this more in depth in a future lesson. For now, we'll assume you know the leaf index. 
This is a reasonable assumption for most mints given that the minting will be @@ -571,7 +571,7 @@ controlled by your code and can be set up sequentially so that your code can track which index is going to be used for each mint. I.e. the first mint will use index 0, the second index 1, etc. -Once you have the leaf index, you can derive the cNFT’s corresponding asset ID. +Once you have the leaf index, you can derive the cNFT's corresponding asset ID. When using Bubblegum, the asset ID is a PDA derived using the Bubblegum program ID and the following seeds: @@ -607,7 +607,7 @@ const { result } = await response.json(); console.log(JSON.stringify(result, null, 2)); ``` -This will return a JSON object that is comprehensive of what a traditional NFT’s +This will return a JSON object that is comprehensive of what a traditional NFT's on- and offchain metadata would look like combined. For example, you can find the cNFT attributes at `content.metadata.attributes` or the image at `content.files.uri`. @@ -626,30 +626,30 @@ and more. For example, Helius supports the following methods: - `getAssetsByCreator` - `getAssetsByGroup` -We won’t go over most of these directly, but be sure to look through the +We won't go over most of these directly, but be sure to look through the [Helius docs](https://docs.helius.dev/compression-and-das-api/digital-asset-standard-das-api) to learn how to use them correctly. #### Transfer cNFTs Just as with a standard SPL token transfer, security is paramount. An SPL token -transfer, however, makes verifying transfer authority very easy. It’s built into -the SPL Token program and standard signing. A compressed token’s ownership is +transfer, however, makes verifying transfer authority very easy. It's built into +the SPL Token program and standard signing. A compressed token's ownership is more difficult to verify. The actual verification will happen program-side, but your client-side code needs to provide additional information to make it possible. 
 While there is a Bubblegum `createTransferInstruction` helper function, there is
 more assembly required than usual. Specifically, the Bubblegum program needs to
-verify that the entirety of the cNFT’s data is what the client asserts before a
+verify that the entirety of the cNFT's data is what the client asserts before a
 transfer can occur. The entirety of the cNFT data has been hashed and stored as
 a single leaf on the Merkle tree, and the Merkle tree is simply a hash of all
-the tree’s leafs and branches. Because of this, you can’t simply tell the
-program what account to look at and have it compare that account’s `authority`
+the tree's leaves and branches. Because of this, you can't simply tell the
+program what account to look at and have it compare that account's `authority`
 or `owner` field to the transaction signer.
 
 Instead, you need to provide the entirety of the cNFT data and any of the Merkle
-tree’s proof information that isn’t stored in the canopy. That way, the program
+tree's proof information that isn't stored in the canopy. That way, the program
 can independently prove that the provided cNFT data, and therefore the cNFT
 owner, is accurate. Only then can the program safely determine if the
 transaction signer should, in fact, be allowed to transfer the cNFT.
 
@@ -710,7 +710,7 @@ const treeAccount = await ConcurrentMerkleTreeAccount.fromAccountAddress(
 ```
 
 Step four is the most conceptually challenging step. Using the three pieces of
-information gathered, you’ll need to assemble the proof path for the cNFT’s
+information gathered, you'll need to assemble the proof path for the cNFT's
 corresponding leaf. The proof path is represented as accounts passed to the
 program instruction. The program uses each of the account addresses as proof
 nodes to prove the leaf data is what you say it is.
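Conceptually, the proof-based check the program performs can be sketched as follows. SHA-256 stands in for the actual onchain hashing scheme; the point is only the shape of the computation — fold one sibling hash per layer into the leaf hash and compare the result with the trusted root.

```typescript
import { createHash } from "node:crypto";

const sha256 = (data: Buffer): Buffer =>
  createHash("sha256").update(data).digest();

// Recompute the root from a leaf hash, its index, and the sibling hashes
// (proof nodes) for each layer. A verifier accepts the leaf only if this
// result equals the root it already trusts.
function computeRoot(leaf: Buffer, proof: Buffer[], leafIndex: number): Buffer {
  let node = leaf;
  let index = leafIndex;
  for (const sibling of proof) {
    // Even index: node is the left child; odd index: node is the right child.
    node =
      index % 2 === 0
        ? sha256(Buffer.concat([node, sibling]))
        : sha256(Buffer.concat([sibling, node]));
    index = Math.floor(index / 2);
  }
  return node;
}
```

This sketch also shows why the canopy helps: for the layers already cached onchain, the client can omit the corresponding proof nodes from the accounts it passes in.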
@@ -758,7 +758,7 @@ function, `createTransferInstruction`, requires the following arguments:
     first
   - `nonce` - used to ensure that no two leaves have the same hash; this value
     should be the same as `index`
-  - `index` - the index where the cNFT’s leaf is located on the Merkle tree
+  - `index` - the index where the cNFT's leaf is located on the Merkle tree
 
 An example of this is shown below. Note that the first 3 lines of code grab
 additional information nested in the objects shown previously so they are ready
@@ -798,23 +798,23 @@ const transferIx = createTransferInstruction(
 
 ### Conclusion
 
-We’ve covered the primary skills needed to interact with cNFTs, but haven’t been
+We've covered the primary skills needed to interact with cNFTs, but haven't been
 fully comprehensive. You can also use Bubblegum to do things like burn, verify,
-delegate, and more. We won’t go through these, but these instructions are
+delegate, and more. We won't go through these, but these instructions are
 similar to the mint and transfer process. If you need this additional
 functionality, take a look at the
 [Bubblegum client source code](https://github.com/metaplex-foundation/mpl-bubblegum/tree/main/clients/js-solita)
 and leverage the helper functions it provides.
 
 Keep in mind that compression is fairly new. Available tooling will evolve
-rapidly but the principles you’ve learned in this lesson will likely remain the
+rapidly but the principles you've learned in this lesson will likely remain the
 same. These principles can also be broadened to arbitrary state compression, so
-be sure to master them here so you’re ready for more fun stuff in future
+be sure to master them here so you're ready for more fun stuff in future
 lessons!
 
 ## Lab
 
-Let’s jump in and practice creating and working with cNFTs. Together, we’ll
+Let's jump in and practice creating and working with cNFTs. Together, we'll
 build as simple a script as possible that will let us mint a cNFT collection
 from a Merkle tree.
@@ -835,24 +835,24 @@ in `uri.ts`.
 
 The `uri.ts` file provides 10k URIs that you can use for the offchain portion of
 your NFT metadata. You can, of course, create your own metadata. But this lesson
-isn’t explicitly about preparing metadata so we’ve provided some for you.
+isn't explicitly about preparing metadata so we've provided some for you.
 
 The `utils.ts` file has a few helper functions to keep you from writing more
 unnecessary boilerplate than you need to. They are as follows:
 
 - `getOrCreateKeypair` will create a new keypair for you and save it to a `.env`
-  file, or if there’s already a private key in the `.env` file it will
+  file, or if there's already a private key in the `.env` file it will
   initialize a keypair from that.
 - `airdropSolIfNeeded` will airdrop some Devnet SOL to a specified address if
-  that address’s balance is below 1 SOL.
+  that address's balance is below 1 SOL.
 - `createNftMetadata` will create the NFT metadata for a given creator public
-  key and index. The metadata it’s getting is just dummy metadata using the URI
+  key and index. The metadata it's getting is just dummy metadata using the URI
   corresponding to the provided index from the `uri.ts` list of URIs.
 - `getOrCreateCollectionNFT` will fetch the collection NFT from the address
   specified in `.env` or if there is none it will create a new one and add the
   address to `.env`.
 
-Finally, there’s some boilerplate in `index.ts` that calls creates a new Devnet
+Finally, there's some boilerplate in `index.ts` that creates a new Devnet
 connection, calls `getOrCreateKeypair` to initialize a “wallet,” and calls
 `airdropSolIfNeeded` to fund the wallet if its balance is low.
 
 We will be writing all of our code in the `index.ts`.
 
 #### 2. Create the Merkle tree account
 
-We’ll start by creating the Merkle tree account. Let’s encapsulate this in a
-function that will eventually create _and_ initialize the account. We’ll put it
-below our `main` function in `index.ts`. 
Let’s call it
+We'll start by creating the Merkle tree account. Let's encapsulate this in a
+function that will eventually create _and_ initialize the account. We'll put it
+below our `main` function in `index.ts`. Let's call it
 `createAndInitializeTree`. For this function to work, it will need the
 following parameters:
 
 - `connection` - a `Connection` to use for interacting with the network.
 - `payer` - a `Keypair` that will pay for transactions.
 - `maxDepthSizePair` - a `ValidDepthSizePair`. This type comes from
-  `@solana/spl-account-compression`. It’s a simple object with properties
+  `@solana/spl-account-compression`. It's a simple object with properties
   `maxDepth` and `maxBufferSize` that enforces a valid combination of the two
   values.
 - `canopyDepth` - a number for the canopy depth
 
 In the body of the function,
-we’ll generate a new address for the tree, then create the instruction for
+we'll generate a new address for the tree, then create the instruction for
 allocating a new Merkle tree account by calling `createAllocTreeIx` from
 `@solana/spl-account-compression`.
@@ -910,15 +910,15 @@ This instruction needs us to provide the following:
   and the Bubblegum program
 - `merkleTree` - the address of the Merkle tree
 - `payer` - the transaction fee payer
-- `treeCreator` - the address of the tree creator; we’ll make this the same as
+- `treeCreator` - the address of the tree creator; we'll make this the same as
   `payer`
 - `logWrapper` - make this the `SPL_NOOP_PROGRAM_ID`
 - `compressionProgram` - make this the `SPL_ACCOUNT_COMPRESSION_PROGRAM_ID`
 - `args` - a list of instruction arguments; this includes:
-  - `maxBufferSize` - the buffer size from our function’s `maxDepthSizePair`
+  - `maxBufferSize` - the buffer size from our function's `maxDepthSizePair`
     parameter
-  - `maxDepth` - the max depth from our function’s `maxDepthSizePair` parameter
-  - `public` - whether or no the tree should be public; we’ll set this to
+  - `maxDepth` - the max depth from our function's `maxDepthSizePair` parameter
+  - `public` - whether or not the tree should be public; we'll set this to
     `false`
 
 Finally, we can add both instructions to a transaction and submit the
@@ -1023,15 +1023,15 @@ run the following:
 
 #### 4. Mint cNFTs to your tree
 
-Believe it or not, that’s all you needed to do to set up your tree to compressed
-NFTs! Now let’s turn our attention to minting.
+Believe it or not, that's all you needed to do to set up your tree for compressed
+NFTs! Now let's turn our attention to minting.
 
-First, let’s declare a function called `mintCompressedNftToCollection`. It will
+First, let's declare a function called `mintCompressedNftToCollection`. It will
 need the following parameters:
 
 - `connection` - a `Connection` to use for interacting with the network.
 - `payer` - a `Keypair` that will pay for transactions.
-- `treeAddress` - the Merkle tree’s address +- `treeAddress` - the Merkle tree's address - `collectionDetails` - the details of the collection as type `CollectionDetails` from `utils.ts` - `amount` - the number of cNFTs to mint @@ -1052,7 +1052,7 @@ The body of this function will do the following: The `createMintToCollectionV1Instruction` takes two arguments: `accounts` and `args`. The latter is simply the NFT metadata. As with all complex instructions, -the primary hurdle is knowing which accounts to provide. So let’s go through +the primary hurdle is knowing which accounts to provide. So let's go through them real quick: - `payer` - the account that will pay for the transaction fees, rent, etc. @@ -1082,7 +1082,7 @@ them real quick: - `tokenMetadataProgram` - the token metadata program that was used for the collection NFT; this is usually always the Metaplex Token Metadata program -When you put it all together, this is what it’ll look like: +When you put it all together, this is what it'll look like: ```typescript async function mintCompressedNftToCollection( @@ -1200,16 +1200,16 @@ Again, to run, in your terminal type: `npm run start` #### 5. Read existing cNFT data -Now that we’ve written code to mint cNFTs, let’s see if we can actually fetch +Now that we've written code to mint cNFTs, let's see if we can actually fetch their data. This is tricky because the onchain data is just the Merkle tree account, the data from which can be used to verify existing information as accurate but is useless in conveying what the information is. -Let’s start by declaring a function `logNftDetails` that takes as parameters +Let's start by declaring a function `logNftDetails` that takes as parameters `treeAddress` and `nftsMinted`. -At this point we don’t actually have a direct identifier of any kind that points -to our cNFT. 
To get that, we’ll need to know the leaf index that was used when +At this point we don't actually have a direct identifier of any kind that points +to our cNFT. To get that, we'll need to know the leaf index that was used when we minted our cNFT. We can then use that to derive the asset ID used by the Read API and subsequently use the Read API to fetch our cNFT data. @@ -1219,9 +1219,9 @@ function from `@metaplex-foundation/mpl-bubblegum` to get the asset ID. Finally, we can use an RPC that supports the [Read API](https://solana.com/developers/guides/javascript/compressed-nfts) to -fetch the asset. We’ll be using +fetch the asset. We'll be using [Helius](https://docs.helius.dev/compression-and-das-api/digital-asset-standard-das-api), -but feel free to choose your own RPC provider. To use Helius, you’ll need to get +but feel free to choose your own RPC provider. To use Helius, you'll need to get a free API Key from [the Helius website](https://dev.helius.xyz/). Then add your `RPC_URL` to your `.env` file. For example: @@ -1262,8 +1262,8 @@ surface that data when requested. If we add a call to this function at the end of `main` and re-run your script, the data we get back in the console is very comprehensive. It includes all of -the data you’d expect in both the onchain and offchain portion of a traditional -NFT. You can find the cNFT’s attributes, files, ownership and creator +the data you'd expect in both the onchain and offchain portion of a traditional +NFT. You can find the cNFT's attributes, files, ownership and creator information, and more. ```json @@ -1360,11 +1360,11 @@ information, and more. Remember, the Read API also includes ways to get multiple assets, query by owner, creator, etc., and more. Be sure to look through the [Helius docs](https://docs.helius.dev/compression-and-das-api/digital-asset-standard-das-api) -to see what’s available. +to see what's available. #### 6. 
Transfer a cNFT -The last thing we’re going to add to our script is a cNFT transfer. Just as with +The last thing we're going to add to our script is a cNFT transfer. Just as with a standard SPL token transfer, security is paramount. Unlike with a standard SPL token transfer, however, to build a secure transfer with state compression of any kind, the program performing the transfer needs the entire asset data. @@ -1382,15 +1382,15 @@ Remember, the general steps are: 4. Prepare the asset proof as a list of `AccountMeta` objects 5. Build and send the Bubblegum transfer instruction -Let’s start by declaring a `transferNft` function that takes the following: +Let's start by declaring a `transferNft` function that takes the following: - `connection` - a `Connection` object - `assetId` - a `PublicKey` object - `sender` - a `Keypair` object so we can sign the transaction - `receiver` - a `PublicKey` object representing the new owner -Inside that function, let’s fetch the asset data again then also fetch the asset -proof. For good measure, let’s wrap everything in a `try catch`. +Inside that function, let's fetch the asset data again then also fetch the asset +proof. For good measure, let's wrap everything in a `try catch`. ```typescript async function transferNft( @@ -1434,7 +1434,7 @@ async function transferNft( } ``` -Next, let’s fetch the Merkle tree account from the chain, get the canopy depth, +Next, let's fetch the Merkle tree account from the chain, get the canopy depth, and assemble the proof path. We do this by mapping the asset proof we got from Helius to a list of `AccountMeta` objects, then removing any proof nodes at the end that are already cached onchain in the canopy. @@ -1578,10 +1578,10 @@ async function transferNft( } ``` -Lets transfer our first compressed NFT at index 0 to someone else. First we’ll +Lets transfer our first compressed NFT at index 0 to someone else. 
First we'll need to spin up another wallet with some funds, then grab the assetID at index 0 -using `getLeafAssetId`. Then we’ll do the transfer. Finally, we’ll print out the -entire collection using our function `logNftDetails`. You’ll note that the NFT +using `getLeafAssetId`. Then we'll do the transfer. Finally, we'll print out the +entire collection using our function `logNftDetails`. You'll note that the NFT at index zero will now belong to our new wallet in the `ownership` field. ```typescript @@ -1646,12 +1646,12 @@ take a look at the solution code on the `solution` branch of the ### Challenge -It’s your turn to take these concepts for a spin on your own! We’re not going to +It's your turn to take these concepts for a spin on your own! We're not going to be overly prescriptive at this point, but here are some ideas: 1. Create your own production cNFT collection -2. Build a UI for this lesson’s lab that will let you mint a cNFT and display it -3. See if you can replicate some of the lab script’s functionality in an onchain +2. Build a UI for this lesson's lab that will let you mint a cNFT and display it +3. See if you can replicate some of the lab script's functionality in an onchain program, i.e. write a program that can mint cNFTs diff --git a/content/courses/state-compression/generalized-state-compression.md b/content/courses/state-compression/generalized-state-compression.md index 4333e308d..f91169e38 100644 --- a/content/courses/state-compression/generalized-state-compression.md +++ b/content/courses/state-compression/generalized-state-compression.md @@ -27,8 +27,8 @@ description: Previously, we discussed state compression in the context of compressed NFTs. At the time of writing, compressed NFTs represent the most common use case for -state compression, but it’s possible to use state compression within any -program. 
In this lesson, we’ll discuss state compression in more generalized +state compression, but it's possible to use state compression within any +program. In this lesson, we'll discuss state compression in more generalized terms so that you can apply it to any of your programs. ### A theoretical overview of state compression @@ -36,7 +36,7 @@ terms so that you can apply it to any of your programs. In traditional programs, data is serialized (typically using borsh) and then stored directly in an account. This allows the data to be easily read and written through Solana programs. You can “trust” the data stored in the accounts -because it can’t be modified except through the mechanisms surfaced by the +because it can't be modified except through the mechanisms surfaced by the program. State compression effectively asserts that the most important piece of this @@ -113,7 +113,7 @@ subsequent writes to successfully occur. This includes: successful. 3. A canopy - When performing an update action on any given leaf, you need the entire proof path from that leaf to the root hash. The canopy stores - intermediate proof nodes along that path so they don’t all have to be passed + intermediate proof nodes along that path so they don't all have to be passed into the program from the client. As a program architect, you control three values directly related to these three @@ -163,17 +163,17 @@ The answer is 20. Choosing a max buffer size is effectively a question of throughput: how many concurrent writes do you need? The larger the buffer, the higher the throughput. -Lastly, the canopy depth will determine your program’s composability. State +Lastly, the canopy depth will determine your program's composability. State compression pioneers have made it clear that omitting a canopy is a bad idea. -Program A can’t call your state-compressed program B if doing so maxes out the +Program A can't call your state-compressed program B if doing so maxes out the transaction size limits. 
Remember, program A also has required accounts and data in addition to required proof paths, each of which take up transaction space. #### Data access on a state-compressed program -A state-compressed account doesn’t store the data itself. Rather, it stores the +A state-compressed account doesn't store the data itself. Rather, it stores the concurrent Merkle tree structure discussed above. The raw data itself lives only -in the blockchain’s cheaper **ledger state.** This makes data access somewhat +in the blockchain's cheaper **ledger state.** This makes data access somewhat more difficult, but not impossible. The Solana ledger is a list of entries containing signed transactions. In @@ -183,7 +183,7 @@ data that has ever been put into a transaction exists in the ledger. Since the state compression hashing process occurs onchain, all the data exists in the ledger state and could theoretically be retrieved from the original transaction by replaying the entire chain state from the beginning. However, -it’s much more straightforward (though still complicated) to have +it's much more straightforward (though still complicated) to have an **indexer** track and index this data as the transactions occur. This ensures there is an offchain “cache” of the data that anyone can access and subsequently verify against the onchain root hash. @@ -193,7 +193,7 @@ This process is complex, but it will make sense after some practice. ### State compression tooling The theory described above is essential to properly understanding state -compression. But you don’t have to implement any of it from scratch. Brilliant +compression. But you don't have to implement any of it from scratch. Brilliant engineers have laid most of the groundwork for you in the form of the SPL State Compression Program and the Noop Program. @@ -209,12 +209,12 @@ primary purpose is to make leaf data easier to index by logging it to the ledger state. 
When you want to store compressed data, you pass it to the State Compression program where it gets hashed and emitted as an “event” to the Noop program. The hash gets stored in the corresponding concurrent Merkle tree, but -the raw data remains accessible through the Noop program’s transaction logs. +the raw data remains accessible through the Noop program's transaction logs. #### Index data for easy lookup Under normal conditions, you would typically access onchain data by fetching the -appropriate account. When using state compression, however, it’s not so +appropriate account. When using state compression, however, it's not so straightforward. As mentioned above, the data now exists in the ledger state rather than in an @@ -223,18 +223,18 @@ instruction. Unfortunately, while this data will in a sense exist in the ledger state forever, it will likely be inaccessible through validators after a certain period of time. -To save space and be more performant, validators don’t retain every transaction -back to the genesis block. The specific amount of time you’ll be able to access +To save space and be more performant, validators don't retain every transaction +back to the genesis block. The specific amount of time you'll be able to access the Noop instruction logs related to your data will vary based on the validator. -Eventually, you’ll lose access to it if you’re relying directly on instruction +Eventually, you'll lose access to it if you're relying directly on instruction logs. Technically, you *can* replay the transaction state back to the genesis block -but the average team isn’t going to do that, and it certainly won’t be +but the average team isn't going to do that, and it certainly won't be performant. The [Digital Asset Standard (DAS)](https://docs.helius.dev/compression-and-das-api/digital-asset-standard-das-api) has been adopted by many RPC providers to enable efficient queries of compressed -NFTs and other assets. 
However, at the time of writing, it doesn’t support +NFTs and other assets. However, at the time of writing, it doesn't support arbitrary state compression. Instead, you have two primary options: 1. Use an indexing provider that will build a custom indexing solution for your @@ -251,7 +251,7 @@ need to rely on infrastructure providers to handle their indexing. #### Create Rust types As with a typical Anchor program, one of the first things you should do is -define your program’s Rust types. However, Rust types in a traditional Anchor +define your program's Rust types. However, Rust types in a traditional Anchor program often represent accounts. In a state-compressed program, your account state will only store the Merkle tree. The more “usable” data schema will just be serialized and logged to the Noop program. @@ -280,7 +280,7 @@ impl MessageLog { To be abundantly clear, **this is not an account that you will be able to read from**. Your program will be creating an instance of this type from instruction inputs, not constructing an instance of this type from account data that it -reads. We’ll discuss how to read data in a later section. +reads. We'll discuss how to read data in a later section. #### Initialize a new tree @@ -339,14 +339,14 @@ pub fn create_messages_tree( #### Add hashes to the tree -With an initialized Merkle tree, it’s possible to start adding data hashes. This +With an initialized Merkle tree, it's possible to start adding data hashes. This involves passing the uncompressed data to an instruction on your program that will hash the data, log it to the Noop program, and use the State Compression -Program’s `append` instruction to add the hash to the tree. The following +Program's `append` instruction to add the hash to the tree. The following discuss what your instruction needs to do in depth: 1. Use the `hashv` function from the `keccak` crate to hash the data. 
In most - cases, you’ll want to also hash the owner or authority of the data as well to + cases, you'll want to also hash the owner or authority of the data as well to ensure that it can only be modified by the proper authority. 2. Create a log object representing the data you wish to log to the Noop Program, then call `wrap_application_data_v1` to issue a CPI to the Noop @@ -354,7 +354,7 @@ discuss what your instruction needs to do in depth: available to any client looking for it. For broad use cases like cNFTs, that would be indexers. You might also create your own observing client to simulate what indexers are doing but specific to your application. -3. Build and issue a CPI to the State Compression Program’s `append` +3. Build and issue a CPI to the State Compression Program's `append` instruction. This takes the hash computed in step 1 and adds it to the next available leaf on your Merkle tree. Just as before, this requires the Merkle tree address and the tree authority bump as signature seeds. @@ -413,11 +413,11 @@ as those used to append the initial data to the tree: 1. **Verify update authority** - The first step is new. In most cases, you want to verify update authority. This typically involves proving that the signer of the `update` transaction is the true owner or authority of the leaf at the - given index. Since the data is compressed as a hash on the leaf, we can’t + given index. Since the data is compressed as a hash on the leaf, we can't simply compare the `authority` public key to a stored value. Instead, we need to compute the previous hash using the old data and the `authority` listed in the account validation struct. We then build and issue a CPI to the State - Compression Program’s `verify_leaf` instruction using our computed hash. + Compression Program's `verify_leaf` instruction using our computed hash. 2. **Hash the new data** - This step is the same as the first step from appending initial data. 
Use the `hashv` function from the `keccak` crate to hash the new data and the update authority, each as their corresponding byte @@ -427,7 +427,7 @@ as those used to append the initial data to the tree: `wrap_application_data_v1` to issue a CPI to the Noop program. 4. **Replace the existing leaf hash** - This step is slightly different than the last step of appending initial data. Build and issue a CPI to the State - Compression Program’s `replace_leaf` instruction. This uses the old hash, the + Compression Program's `replace_leaf` instruction. This uses the old hash, the new hash, and the leaf index to replace the data of the leaf at the given index with the new hash. Just as before, this requires the Merkle tree address and the tree authority bump as signature seeds. @@ -504,8 +504,8 @@ pub fn update_message( #### Delete hashes -At the time of writing, the State Compression Program doesn’t provide an -explicit `delete` instruction. Instead, you’ll want to update leaf data with +At the time of writing, the State Compression Program doesn't provide an +explicit `delete` instruction. Instead, you'll want to update leaf data with data that indicates the data as “deleted.” The specific data will depend on your use case and security concerns. Some may opt to set all data to 0, whereas others might store a static string that all “deleted” items will have in common. @@ -513,13 +513,13 @@ others might store a static string that all “deleted” items will have in com #### Access data from a client The discussion so far has covered 3 of the 4 standard CRUD procedures: Create, -Update, and Delete. What’s left is one of the more difficult concepts in state +Update, and Delete. What's left is one of the more difficult concepts in state compression: reading data. -Accessing data from a client is tricky primarily because the data isn’t stored +Accessing data from a client is tricky primarily because the data isn't stored in a format that is easy to access. 
The data hashes stored in the Merkle tree -account can’t be used to reconstruct the initial data, and the data logged to -the Noop program isn’t available indefinitely. +account can't be used to reconstruct the initial data, and the data logged to +the Noop program isn't available indefinitely. Your best bet is one of two options: @@ -531,18 +531,18 @@ Your best bet is one of two options: If your project is truly decentralized such that many participants will interact with your program through means other than your own frontend, then option 2 might not be sufficient. However, depending on the scale of the project or -whether or not you’ll have control over most program access, it can be a viable +whether or not you'll have control over most program access, it can be a viable approach. There is no “right” way to do this. Two potential approaches are: 1. Store the raw data in a database at the same time as sending it to the program, along with the leaf that the data is hashed and stored to. -2. Create a server that observes your program’s transactions, looks up the +2. Create a server that observes your program's transactions, looks up the associated Noop logs, decodes the logs, and stores them. -We’ll do a little bit of both when writing tests in this lesson’s lab (though we -won’t persist data in a db - it will only live in memory for the duration of the +We'll do a little bit of both when writing tests in this lesson's lab (though we +won't persist data in a db - it will only live in memory for the duration of the tests). The setup for this is somewhat tedious. Given a particular transaction, you can @@ -619,7 +619,7 @@ development experience, please share with the community! ## Lab -Let’s practice generalized state compression by creating a new Anchor program. +Let's practice generalized state compression by creating a new Anchor program. This program will use custom state compression to power a simple note-taking app. 
@@ -631,8 +631,8 @@ Start by initializing an Anchor program: anchor init compressed-notes ``` -We’ll be using the `spl-account-compression` crate with the `cpi` feature -enabled. Let’s add it as a dependency in `programs/compressed-notes/Cargo.toml`. +We'll be using the `spl-account-compression` crate with the `cpi` feature +enabled. Let's add it as a dependency in `programs/compressed-notes/Cargo.toml`. ```toml [dependencies] @@ -641,8 +641,8 @@ spl-account-compression = { version="0.2.0", features = ["cpi"] } solana-program = "1.16.0" ``` -We’ll be testing locally but we need both the Compression program and the Noop -program from Mainnet. We’ll need to add these to the `Anchor.toml` in the root +We'll be testing locally but we need both the Compression program and the Noop +program from Mainnet. We'll need to add these to the `Anchor.toml` in the root directory so they get cloned to our local cluster. ```toml @@ -656,7 +656,7 @@ address = "noopb9bkMVfRPU8AsbpTUg8AQkHtKwMYZiFUjNRtMmV" address = "cmtDvXumGCrqC1Age74AVPhSRVXJMd8PJS91L8KbNCK" ``` -Lastly, let’s prepare the `lib.rs` file for the rest of the Demo. Remove the +Lastly, let's prepare the `lib.rs` file for the rest of the Demo. Remove the `initialize` instruction and the `Initialize` accounts struct, then add the imports shown in the code snippet below (be sure to put in **_your_** program id): @@ -689,8 +689,8 @@ pub mod compressed_notes { } ``` -For the rest of this Demo, we’ll be making updates to the program code directly -in the `lib.rs` file. This simplifies the explanations a bit. You’re welcome to +For the rest of this Demo, we'll be making updates to the program code directly +in the `lib.rs` file. This simplifies the explanations a bit. You're welcome to modify the structure as you will. Feel free to build before continuing. This ensures your environment is working @@ -698,7 +698,7 @@ properly and shortens future build times. #### 2. 
Define `Note` schema -Next, we’re going to define what a note looks like within our program. Notes +Next, we're going to define what a note looks like within our program. Notes should have the following properties: - `leaf_node` - this should be a 32-byte array representing the hash stored on @@ -723,15 +723,15 @@ impl NoteLog { ``` In a traditional Anchor program, this would be an account struct, but since -we’re using state compression, our accounts won’t be mirroring our native -structures. Since we don’t need all the functionality of an account, we can just +we're using state compression, our accounts won't be mirroring our native +structures. Since we don't need all the functionality of an account, we can just use the `AnchorSerialize` derive macro rather than the `account` macro. #### 3. Define input accounts and constraints As luck would have it, every one of our instructions will be using the same -accounts. We’ll create a single `NoteAccounts` struct for our account -validation. It’ll need the following accounts: +accounts. We'll create a single `NoteAccounts` struct for our account +validation. It'll need the following accounts: - `owner` - this is the creator and owner of the note; should be a signer on the transaction @@ -771,7 +771,7 @@ pub struct NoteAccounts<'info> { #### 4. Create `create_note_tree` instruction -Next, let’s create our `create_note_tree` instruction. Remember, clients will +Next, let's create our `create_note_tree` instruction. Remember, clients will have already allocated the Merkle tree account but will use this instruction to initialize it. @@ -834,25 +834,25 @@ and the tree authority bump. #### 5. Create `append_note` instruction -Now, let’s create our `append_note` instruction. This instruction needs to take -the raw note as a String and compress it into a hash that we’ll store on the -Merkle tree. We’ll also log the note to the Noop program so the entirety of the -data exists within the chain’s state. 
+Now, let's create our `append_note` instruction. This instruction needs to take +the raw note as a String and compress it into a hash that we'll store on the +Merkle tree. We'll also log the note to the Noop program so the entirety of the +data exists within the chain's state. The steps here are as follows: 1. Use the `hashv` function from the `keccak` crate to hash the note and owner, - each as their corresponding byte representation. It’s **_crucial_** that you - hash the owner as well as the note. This is how we’ll verify note ownership + each as their corresponding byte representation. It's **_crucial_** that you + hash the owner as well as the note. This is how we'll verify note ownership before updates in the update instruction. 2. Create an instance of the `NoteLog` struct using the hash from step 1, the - owner’s public key, and the raw note as a String. Then call + owner's public key, and the raw note as a String. Then call `wrap_application_data_v1` to issue a CPI to the Noop program, passing the instance of `NoteLog`. This ensures the entirety of the note (not just the hash) is readily available to any client looking for it. For broad use cases like cNFTs, that would be indexers. You might create your observing client to simulate what indexers are doing but for your own application. -3. Build and issue a CPI to the State Compression Program’s `append` +3. Build and issue a CPI to the State Compression Program's `append` instruction. This takes the hash computed in step 1 and adds it to the next available leaf on your Merkle tree. Just as before, this requires the Merkle tree address and the tree authority bump as signature seeds. @@ -901,14 +901,14 @@ pub mod compressed_notes { #### 6. Create `update_note` instruction -The last instruction we’ll make is the `update_note` instruction. This should +The last instruction we'll make is the `update_note` instruction. This should replace an existing leaf with a new hash representing the new updated note data. 
-For this to work, we’ll need the following parameters: +For this to work, we'll need the following parameters: 1. `index` - the index of the leaf we are going to update 2. `root` - the root hash of the Merkle tree -3. `old_note` - the string representation of the old note we’re updating +3. `old_note` - the string representation of the old note we're updating 4. `new_note` - the string representation of the new note we want to update to Remember, the steps here are similar to `append_note`, but with some minor @@ -916,22 +916,22 @@ additions and modifications: 1. The first step is new. We need to first prove that the `owner` calling this function is the true owner of the leaf at the given index. Since the data is - compressed as a hash on the leaf, we can’t simply compare the `owner` public + compressed as a hash on the leaf, we can't simply compare the `owner` public key to a stored value. Instead, we need to compute the previous hash using the old note data and the `owner` listed in the account validation struct. We - then build and issue a CPI to the State Compression Program’s `verify_leaf` + then build and issue a CPI to the State Compression Program's `verify_leaf` instruction using our computed hash. 2. This step is the same as the first step from creating the `append_note` instruction. Use the `hashv` function from the `keccak` crate to hash the new note and its owner, each as their corresponding byte representation. 3. This step is the same as the second step from creating the `append_note` instruction. Create an instance of the `NoteLog` struct using the hash from - step 2, the owner’s public key, and the new note as a string. Then call + step 2, the owner's public key, and the new note as a string. Then call `wrap_application_data_v1` to issue a CPI to the Noop program, passing the instance of `NoteLog` 4. This step is slightly different than the last step from creating the `append_note` instruction. 
Build and issue a CPI to the State Compression - Program’s `replace_leaf` instruction. This uses the old hash, the new hash, + Program's `replace_leaf` instruction. This uses the old hash, the new hash, and the leaf index to replace the data of the leaf at the given index with the new hash. Just as before, this requires the Merkle tree address and the tree authority bump as signature seeds. @@ -1009,19 +1009,19 @@ pub mod compressed_notes { #### 7. Client test setup -We’re going to write a few tests to ensure that our program works as expected. -First, let’s do some setup. +We're going to write a few tests to ensure that our program works as expected. +First, let's do some setup. -We’ll be using the `@solana/spl-account-compression` package. Go ahead and +We'll be using the `@solana/spl-account-compression` package. Go ahead and install it: ```bash yarn add @solana/spl-account-compression ``` -Next, we’re going to give you the contents of a utility file we’ve created to +Next, we're going to give you the contents of a utility file we've created to make testing easier. Create a `utils.ts` file in the `tests` directory, add in -the below, then we’ll explain it. +the below, then we'll explain it. ```typescript import { @@ -1132,21 +1132,21 @@ export async function getNoteLog(connection: Connection, txSignature: string) { There are 3 main things in the above file: -1. `NoteLog` - a class representing the note log we’ll find in the Noop program - logs. We’ve also added the borsh schema as `NoteLogBorshSchema` for +1. `NoteLog` - a class representing the note log we'll find in the Noop program + logs. We've also added the borsh schema as `NoteLogBorshSchema` for deserialization. 2. `getHash` - a function that creates a hash of the note and note owner so we can compare it to what we find on the Merkle tree -3. `getNoteLog` - a function that looks through the provided transaction’s logs, +3. 
`getNoteLog` - a function that looks through the provided transaction's logs, finds the Noop program logs, then deserializes and returns the corresponding Note log. #### 8. Write client tests -Now that we’ve got our packages installed and utility file ready, let’s dig into -the tests themselves. We’re going to create four of them: +Now that we've got our packages installed and utility file ready, let's dig into +the tests themselves. We're going to create four of them: -1. Create Note Tree - this will create the Merkle tree we’ll be using to store +1. Create Note Tree - this will create the Merkle tree we'll be using to store note hashes 2. Add Note - this will call our `append_note` instruction 3. Add Max Size Note - this will call our `append_note` instruction with a note @@ -1154,11 +1154,11 @@ the tests themselves. We’re going to create four of them: 4. Update First Note - this will call our `update_note` instruction to modify the first note we added -The first test is mostly just for setup. In the last three tests, we’ll be +The first test is mostly just for setup. In the last three tests, we'll be asserting each time that the note hash on the tree matches what we would expect given the note text and signer. -Let’s start with our imports. There are quite a few from Anchor, +Let's start with our imports. There are quite a few from Anchor, `@solana/web3.js`, `@solana/spl-account-compression`, and our own utils file. ```typescript @@ -1183,7 +1183,7 @@ import { getHash, getNoteLog } from "./utils"; import { assert } from "chai"; ``` -Next, we’ll want to set up the state variables we’ll be using throughout our +Next, we'll want to set up the state variables we'll be using throughout our tests. This includes the default Anchor setup as well as generating a Merkle tree keypair, the tree authority, and some notes. @@ -1217,12 +1217,12 @@ describe("compressed-notes", () => { }); ``` -Finally, let’s start with the tests themselves. 
First the `Create Note Tree` +Finally, let's start with the tests themselves. First the `Create Note Tree` test. This test will do two things: 1. Allocate a new account for the Merkle tree with a max depth of 3, max buffer size of 8, and canopy depth of 0 -2. Initialize this new account using our program’s `createNoteTree` instruction +2. Initialize this new account using our program's `createNoteTree` instruction ```typescript it("Create Note Tree", async () => { @@ -1258,7 +1258,7 @@ it("Create Note Tree", async () => { }); ``` -Next, we’ll create the `Add Note` test. It should call `append_note` with +Next, we'll create the `Add Note` test. It should call `append_note` with `firstNote`, then check that the onchain hash matches our computed hash and that the note log matches the text of the note we passed into the instruction. @@ -1282,7 +1282,7 @@ it("Add Note", async () => { }); ``` -Next, we’ll create the `Add Max Size Note` test. It is the same as the previous +Next, we'll create the `Add Max Size Note` test. It is the same as the previous test, but with the second note. ```typescript @@ -1306,14 +1306,14 @@ it("Add Max Size Note", async () => { }); ``` -Lastly, we’ll create the `Update First Note` test. This is slightly more complex -than adding a note. We’ll do the following: +Lastly, we'll create the `Update First Note` test. This is slightly more complex +than adding a note. We'll do the following: -1. Get the Merkle tree root as it’s required by the instruction. +1. Get the Merkle tree root as it's required by the instruction. 2. Call the `update_note` instruction of our program, passing in the index 0 (for the first note), the Merkle tree root, the first note, and the updated data. Remember, it needs the first note and the root because the program must - verify the entire proof path for the note’s leaf before it can be updated. + verify the entire proof path for the note's leaf before it can be updated. 
```typescript
it("Update First Note", async () => {
@@ -1344,19 +1344,19 @@ it("Update First Note", async () => {
});
```

-That’s it, congrats! Go ahead and run `anchor test` and you should get four
+That's it, congrats! Go ahead and run `anchor test` and you should get four
passing tests.

-If you’re running into issues, feel free to go back through some of the demo or
+If you're running into issues, feel free to go back through some of the demo or
look at the full solution code in the
[Compressed Notes repository](https://github.com/unboxed-software/anchor-compressed-notes).

## Challenge

-Now that you’ve practiced the basics of state compression, add a new instruction
+Now that you've practiced the basics of state compression, add a new instruction
to the Compressed Notes program. This new instruction should allow users to
-delete an existing note. keep in mind that you can’t remove a leaf from the
-tree, so you’ll need to decide what “deleted” looks like for your program. Good
+delete an existing note. Keep in mind that you can't remove a leaf from the
+tree, so you'll need to decide what "deleted" looks like for your program. Good
luck!

If you'd like a very simple example of a delete function, check out the
diff --git a/content/courses/token-extensions/close-mint.md b/content/courses/token-extensions/close-mint.md
index f2751b852..42a0df6eb 100644
--- a/content/courses/token-extensions/close-mint.md
+++ b/content/courses/token-extensions/close-mint.md
@@ -290,7 +290,7 @@ the local RPC URL.
const connection = new Connection("http://127.0.0.1:8899", "confirmed"); ``` -Alternatively, if you’d like to use testnet or devnet, import the +Alternatively, if you'd like to use testnet or devnet, import the `clusterApiUrl` from `@solana/web3.js` and pass it to the connection as such: ```typescript diff --git a/content/courses/token-extensions/default-account-state.md b/content/courses/token-extensions/default-account-state.md index baff06b7a..c0a19a134 100644 --- a/content/courses/token-extensions/default-account-state.md +++ b/content/courses/token-extensions/default-account-state.md @@ -296,7 +296,7 @@ the local RPC URL. const connection = new Connection("http://127.0.0.1:8899", "confirmed"); ``` -Alternatively, if you’d like to use testnet or devnet, import the +Alternatively, if you'd like to use testnet or devnet, import the `clusterApiUrl` from `@solana/web3.js` and pass it to the connection as such: ```typescript @@ -647,7 +647,7 @@ esrun src/index.ts #### 7.3 Transferring without thawing the recipient's account -Now that we’ve tested minting, we can test transferring our tokens frozen and +Now that we've tested minting, we can test transferring our tokens frozen and not. First lets test a transfer without thawing the recipient's token account. Remember, by default, the `otherTokenAccountKeypair` is frozen due to the extension. @@ -746,7 +746,7 @@ Remember the key takeaways: accounts. - Frozen account's balance cannot change. -Congratulations! We’ve just created and tested a mint using the default account +Congratulations! We've just created and tested a mint using the default account extension! ## Challenge diff --git a/content/courses/token-extensions/immutable-owner.md b/content/courses/token-extensions/immutable-owner.md index b5342a53a..d5eb17192 100644 --- a/content/courses/token-extensions/immutable-owner.md +++ b/content/courses/token-extensions/immutable-owner.md @@ -182,7 +182,7 @@ the local RPC URL. 
const connection = new Connection("http://127.0.0.1:8899", "confirmed"); ``` -Alternatively, if you’d like to use testnet or devnet, import the +Alternatively, if you'd like to use testnet or devnet, import the `clusterApiUrl` from `@solana/web3.js` and pass it to the connection as such: ```typescript @@ -360,7 +360,7 @@ const signature = await sendAndConfirmTransaction(connection, transaction, [ return signature; ``` -Now that we’ve added the functionality for `token-helper`, we can create our +Now that we've added the functionality for `token-helper`, we can create our test token accounts. One of the two test token accounts will be created by calling `createTokenAccountWithImmutableOwner`. The other will be created with the baked-in SPL helper function `createAssociatedTokenAccount`. This helper @@ -475,7 +475,7 @@ Now we can run `npx esrun src/index.ts`. This test should log a failure message similar to the one from the previous test. This means that both of our token accounts are in fact immutable and working as intended. -Congratulations! We’ve just created token accounts and tested the immutable +Congratulations! We've just created token accounts and tested the immutable owner extension! If you are stuck at any point, you can find the working code on the `solution` branch of [this repository](https://github.com/Unboxed-Software/solana-lab-immutable-owner/tree/solution). diff --git a/content/courses/token-extensions/interest-bearing-token.md b/content/courses/token-extensions/interest-bearing-token.md index ba44e30ad..29cfa7735 100644 --- a/content/courses/token-extensions/interest-bearing-token.md +++ b/content/courses/token-extensions/interest-bearing-token.md @@ -542,7 +542,7 @@ Now run `npx esrun src/index.ts`. This is expected to fail and log out **Mint tokens and read interest rate** -So we’ve tested updating the interest rate. How do we check that the accrued +So we've tested updating the interest rate. 
How do we check that the accrued
interest increases when an account mints more tokens? We can use the
`amountToUiAmount` and `getAccount` helpers from the SPL library to help us
achieve this.
@@ -690,7 +690,7 @@ try {

This is expected to work and the new interest rate should be 10.

-Thats it! We’ve just created an interest bearing token, updated the interest
+That's it! We've just created an interest-bearing token, updated the interest
rate and logged the updated state of the token!

## Challenge
diff --git a/content/courses/token-extensions/non-transferable-token.md b/content/courses/token-extensions/non-transferable-token.md
index f9f2d4c43..c62ceceb0 100644
--- a/content/courses/token-extensions/non-transferable-token.md
+++ b/content/courses/token-extensions/non-transferable-token.md
@@ -303,7 +303,7 @@ esrun src/index.ts
```

The non-transferable mint has been set up correctly and will be created when we
-run `npm start`. Let’s move on to the next step and create a source account and
+run `npm start`. Let's move on to the next step and create a source account and
mint a token to it.

#### 4. Mint token
diff --git a/content/courses/token-extensions/permanent-delegate.md b/content/courses/token-extensions/permanent-delegate.md
index bbc26b9a1..5576ca0af 100644
--- a/content/courses/token-extensions/permanent-delegate.md
+++ b/content/courses/token-extensions/permanent-delegate.md
@@ -338,7 +338,7 @@ the local RPC URL.
const connection = new Connection("http://127.0.0.1:8899", "confirmed"); ``` -Alternatively, if you’d like to use testnet or devnet, import the +Alternatively, if you'd like to use testnet or devnet, import the `clusterApiUrl` from `@solana/web3.js` and pass it to the connection as such: ```typescript diff --git a/content/courses/token-extensions/required-memo.md b/content/courses/token-extensions/required-memo.md index c5f5decf3..43bab1061 100644 --- a/content/courses/token-extensions/required-memo.md +++ b/content/courses/token-extensions/required-memo.md @@ -242,7 +242,7 @@ the local RPC URL. `const connection = new Connection("http://127.0.0.1:8899", "confirmed");` -Alternatively, if you’d like to use testnet or devnet, import the +Alternatively, if you'd like to use testnet or devnet, import the `clusterApiUrl` from `@solana/web3.js` and pass it to the connection as such: ```typescript @@ -600,7 +600,7 @@ extension. npx esrun src/index.ts ``` -Congratulations! We’ve just tested the required memo extension! +Congratulations! We've just tested the required memo extension! ## Challenge diff --git a/content/courses/token-extensions/transfer-fee.md b/content/courses/token-extensions/transfer-fee.md index f981c5e98..16cd7ff93 100644 --- a/content/courses/token-extensions/transfer-fee.md +++ b/content/courses/token-extensions/transfer-fee.md @@ -535,7 +535,7 @@ the local RPC URL. 
const connection = new Connection("http://127.0.0.1:8899", "confirmed"); ``` -Alternatively, if you’d like to use testnet or devnet, import the +Alternatively, if you'd like to use testnet or devnet, import the `clusterApiUrl` from `@solana/web3.js` and pass it to the connection as such: ```typescript diff --git a/content/courses/tokens-and-nfts/token-program.md b/content/courses/tokens-and-nfts/token-program.md index 08794cf32..feb79adc9 100644 --- a/content/courses/tokens-and-nfts/token-program.md +++ b/content/courses/tokens-and-nfts/token-program.md @@ -402,7 +402,7 @@ async function buildMintToTransaction( SPL Token transfers require both the sender and receiver to have token accounts for the mint of the tokens being transferred. The tokens are transferred from -the sender’s token account to the receiver’s token account. +the sender's token account to the receiver's token account. You can use `getOrCreateAssociatedTokenAccount` when obtaining the receiver's associated token account to ensure their token account exists before the @@ -457,7 +457,7 @@ async function buildTransferTransaction( ### Lab -We’re going to use the Token Program to create a Token Mint, create an +We're going to use the Token Program to create a Token Mint, create an Associated Token Account, mint tokens, transfer tokens, and burn tokens. Assuming you already have a `.env` file with a `SECRET_KEY` setup per @@ -838,7 +838,7 @@ balance go up! ### Challenge -Now it’s your turn to build something independently. Create an application that +Now it's your turn to build something independently. Create an application that allows a user to create a new mint, create a token account, and mint tokens. 
To interact with the Token Program using the wallet adapter, you will have to diff --git a/content/guides/advanced/stake-weighted-qos.md b/content/guides/advanced/stake-weighted-qos.md index 7fe090140..3a59738a8 100644 --- a/content/guides/advanced/stake-weighted-qos.md +++ b/content/guides/advanced/stake-weighted-qos.md @@ -106,7 +106,7 @@ Stake-weighted QoS will not work unless BOTH sides are properly configured. ### Configuring the Validator node -On the validator, you’ll have to enable +On the validator, you'll have to enable `--staked-nodes-overrides /path/to/overrides.yml`. The `--staked-nodes-overrides` flag helps the validator prioritize transactions being sent from known sources to apply stake to their transactions. This can @@ -114,7 +114,7 @@ help a validator prioritize certain transactions over known hosts over others, enabling the usage of Stake-weighted QoS with RPCs. RPCs should not be staked in any way. -Today, Stake-weighted QoS gives a stake-weighted priority to 80% of a leader’s +Today, Stake-weighted QoS gives a stake-weighted priority to 80% of a leader's TPU capacity. However, there are configuration options which can be used to virtually assign different stake-weights to TPU peers, including assigning unstaked peers virtual stake. @@ -130,7 +130,7 @@ staked_map_id: `staked_map_id` contains a map of identity public key to the stake amount in lamports to apply to each RPC. When set, the validator will prioritize QUIC connections with the RPC found at that identity publicKey, assigning an amount -of stake to their transactions. The 80% of the leader’s TPU capacity will be +of stake to their transactions. The 80% of the leader's TPU capacity will be split proportionally based on the lamport amounts specified in the `staked-nodes-overrides` file and existing cluster stake. 
diff --git a/content/guides/games/hello-world.md b/content/guides/games/hello-world.md index 9befc65a6..23e4b4bab 100644 --- a/content/guides/games/hello-world.md +++ b/content/guides/games/hello-world.md @@ -195,7 +195,7 @@ Alternatively, you can use the signer's address as an extra seed in the ### Move Left Instruction -Now that we can initialize a `GameDataAccount` account, let’s implement the +Now that we can initialize a `GameDataAccount` account, let's implement the `move_left` instruction which allows a player update their `player_position`. In this example, moving left simply means decrementing the `player_position` @@ -233,8 +233,8 @@ pub struct MoveLeft<'info> { ### Move Right Instruction -Lastly, let’s implement the `move_right` instruction. Similarly, moving right -will simply mean incrementing the `player_position` by 1. We’ll also limit the +Lastly, let's implement the `move_right` instruction. Similarly, moving right +will simply mean incrementing the `player_position` by 1. We'll also limit the maximum position to 3. Just like before, the only account needed for this instruction is the @@ -381,7 +381,7 @@ file and add the code snippets from the following sections. ### Derive the GameDataAccount Account Address -First, let’s derive the PDA for the `GameDataAccount` using the +First, let's derive the PDA for the `GameDataAccount` using the `findProgramAddress` function. > A [Program Derived Address (PDA)](/docs/core/pda.md) is unique address in the @@ -398,7 +398,7 @@ const [globalLevel1GameDataAccount, bump] = ### Initialize the Game State -Next, let’s try to fetch the game data account using the PDA from the previous +Next, let's try to fetch the game data account using the PDA from the previous step. If the account doesn't exist, we'll create it by invoking the `initialize` instruction from our program. 
@@ -460,8 +460,8 @@ console.log("Player position is:", gameDateAccount.playerPosition.toString()); ### Logging the Player's Position -Lastly, let’s use a `switch` statement to log the character's position based on -the `playerPosition` value stored in the `gameDateAccount`. We’ll use this as a +Lastly, let's use a `switch` statement to log the character's position based on +the `playerPosition` value stored in the `gameDateAccount`. We'll use this as a visual representation of the character's movement in the game. ```ts filename="client.ts" diff --git a/content/guides/games/interact-with-tokens.md b/content/guides/games/interact-with-tokens.md index 3e74748b6..d36375972 100644 --- a/content/guides/games/interact-with-tokens.md +++ b/content/guides/games/interact-with-tokens.md @@ -88,13 +88,13 @@ pub mod anchor_token { ``` Here we are simply bringing into scope the crates and corresponding modules we -will be using for this program. We’ll be using the `anchor_spl` and +will be using for this program. We'll be using the `anchor_spl` and `mpl_token_metadata` crates to help us interact with the SPL Token program and Metaplex's Token Metadata program. ## Create Mint instruction -First, let’s implement an instruction to create a new token mint and its +First, let's implement an instruction to create a new token mint and its metadata account. The on-chain token metadata, including the name, symbol, and URI, will be provided as parameters to the instruction. 
@@ -107,7 +107,7 @@ The `create_mint` instruction requires the following accounts: - `admin` - the `ADMIN_PUBKEY` that signs the transaction and pays for the initialization of the accounts - `reward_token_mint` - the new token mint we are initializing, using a PDA as - both the mint account’s address and its mint authority + both the mint account's address and its mint authority - `metadata_account` - the metadata account we are initializing for the token mint - `token_program` - required for interacting with instructions on the Token @@ -295,7 +295,7 @@ health by 10 and mints 1 token to the player's token account as a reward. The `kill_enemy` instruction requires the following accounts: - `player` - the player receiving the token -- `player_data` - the player data account storing the player’s current health +- `player_data` - the player data account storing the player's current health - `player_token_account` - the player's associated token account where tokens will be minted - `reward_token_mint` - the token mint account, specifying the type of token @@ -389,7 +389,7 @@ pub enum ErrorCode { ``` The player's health is reduced by 10 to represent the “battle with the enemy”. -We’ll also check the player's current health and return a custom Anchor error if +We'll also check the player's current health and return a custom Anchor error if the player has 0 health. The instruction then uses a cross-program invocation (CPI) to call the `mint_to` @@ -409,7 +409,7 @@ token and restore their health to its maximum value. 
The `heal` instruction requires the following accounts: - `player` - the player executing the healing action -- `player_data` - the player data account storing the player’s current health +- `player_data` - the player data account storing the player's current health - `player_token_account` - the player's associated token account where the tokens will be burned - `reward_token_mint` - the token mint account, specifying the type of token diff --git a/content/guides/games/store-sol-in-pda.md b/content/guides/games/store-sol-in-pda.md index a024af0f7..d472df7f7 100644 --- a/content/guides/games/store-sol-in-pda.md +++ b/content/guides/games/store-sol-in-pda.md @@ -180,7 +180,7 @@ displays the starting message. The `initialize_level_one` instruction requires 4 accounts: - `new_game_data_account` - the `GameDataAccount` we are initializing to store - the player’s position + the player's position - `chest_vault` - the `ChestVaultAccount` we are initializing to store the SOL reward - `signer` - the player paying for the initialization of the accounts diff --git a/content/guides/getstarted/cosmwasm-to-solana.md b/content/guides/getstarted/cosmwasm-to-solana.md index 5709298fc..04666aa67 100644 --- a/content/guides/getstarted/cosmwasm-to-solana.md +++ b/content/guides/getstarted/cosmwasm-to-solana.md @@ -408,7 +408,7 @@ pub fn process_reset( ## Solana Program Advantages 1. Performance Efficiency: - - Solana’s binary instruction data and direct account manipulation provide + - Solana's binary instruction data and direct account manipulation provide high performance and low latency. - This is critical for high-throughput applications like decentralized exchanges (DEXes) and other performance-sensitive use cases. @@ -424,6 +424,6 @@ pub fn process_reset( specialized logic. In conclusion, Solana is ideal for applications that require high performance, -low latency, and fine-grained control over execution. 
It’s better suited for +low latency, and fine-grained control over execution. It's better suited for developers comfortable with lower-level programming and those who need to optimize for specific use cases. diff --git a/content/guides/getstarted/intro-to-anchor.md b/content/guides/getstarted/intro-to-anchor.md index 3012c23cb..7b098c196 100644 --- a/content/guides/getstarted/intro-to-anchor.md +++ b/content/guides/getstarted/intro-to-anchor.md @@ -27,7 +27,7 @@ and abstractions that make building Solana programs more intuitive and secure. The main macros found in an Anchor program include: - [`declare_id`](#declare_id-macro): Specifies the program's on-chain address -- [`#[program]`](#program-macro): Specifies the module containing the program’s +- [`#[program]`](#program-macro): Specifies the module containing the program's instruction logic - [`#[derive(Accounts)]`](#derive-accounts-macro): Applied to structs to indicate a list of accounts required for an instruction diff --git a/content/guides/getstarted/local-rust-hello-world.md b/content/guides/getstarted/local-rust-hello-world.md index 864c80399..88b4b8692 100644 --- a/content/guides/getstarted/local-rust-hello-world.md +++ b/content/guides/getstarted/local-rust-hello-world.md @@ -306,7 +306,7 @@ await connection.confirmTransaction({ }); console.log( - `Congratulations! Look at your ‘Hello World’ transaction in the Solana Explorer: + `Congratulations! Look at your 'Hello World' transaction in the Solana Explorer: https://explorer.solana.com/tx/${txHash}?cluster=custom`, ); ``` @@ -325,7 +325,7 @@ node client.mjs You should see the following output: ```shell -Congratulations! Look at your ‘Hello World’ transaction in the Solana Explorer: +Congratulations!
Look at your 'Hello World' transaction in the Solana Explorer: https://explorer.solana.com/tx/2fTcQ74z4DVi8WRuf2oNZ36z7k9tGRThaRPXBMYgjMUNUbUSKLrP6djpRUZ8msuTXvZHFe3UXi31dfgytG2aJZbv?cluster=custom ``` diff --git a/content/guides/getstarted/rust-to-solana.md b/content/guides/getstarted/rust-to-solana.md index 883f372c8..3c001d31b 100644 --- a/content/guides/getstarted/rust-to-solana.md +++ b/content/guides/getstarted/rust-to-solana.md @@ -35,13 +35,13 @@ need to know to start their Solana journeys. ## Understanding the Core Differences First, note that this guide aims at understanding the differences in using Rust -as a language when working with Solana. It won’t cover +as a language when working with Solana. It won't cover [Blockchain or Solana basics](https://solana.com/learn/blockchain-basics). -It also won’t cover core Solana concepts that must be understood in order to +It also won't cover core Solana concepts that must be understood in order to program in Solana, such as: -- [Programs](https://solana.com/docs/core/programs) - Solana’s version of smart +- [Programs](https://solana.com/docs/core/programs) - Solana's version of smart contracts - [Accounts](https://solana.com/docs/core/accounts) - A record in the Solana ledger that either holds data (a data account) or is an executable program
However, if the crate used simply depends on `rand` but does not actually generate random numbers, then it is possible to work around this by adding the -following to the program’s Cargo.toml: +following to the program's Cargo.toml: ```toml [dependencies] @@ -222,7 +222,7 @@ allows developers to develop and deploy Solana programs. ![Solana Playground](/assets/guides/rust-to-solana/solana-playground.png) -It’s the easiest way to begin developing with Solana, and it supports building, +It's the easiest way to begin developing with Solana, and it supports building, testing, and deploying Solana Rust programs. Additionally, a number of built-in tutorials are available to guide learning. @@ -247,7 +247,7 @@ and then use `anchor init ` to create a new Anchor project. ## Creating offchain Programs So far, this guide has covered the key details of developing **onchain Solana -programs** in Rust. However, it’s also possible to develop **offchain Solana +programs** in Rust. However, it's also possible to develop **offchain Solana clients** in Rust. This can be done by using the [solana_sdk crate](https://docs.rs/solana-sdk/latest/solana_sdk/). This contains the [solana_client crate](https://docs.rs/solana-client/latest/solana_client/) diff --git a/content/guides/getstarted/scaffold-nextjs-anchor.md b/content/guides/getstarted/scaffold-nextjs-anchor.md index 5adfef9a4..95729eb8d 100644 --- a/content/guides/getstarted/scaffold-nextjs-anchor.md +++ b/content/guides/getstarted/scaffold-nextjs-anchor.md @@ -57,7 +57,7 @@ If you haven't installed Solana CLI, Rust, or Anchor before, you can easily do so by [following our helpful installation guide](https://solana.com/docs/intro/installation) -> This scaffolds only supports TypeScript for now, but don’t worry, TypeScript +> This scaffold only supports TypeScript for now, but don't worry, TypeScript > simply extends on the JavaScript you already know to add helpful type > definitions.
diff --git a/content/guides/getstarted/solana-test-validator.md b/content/guides/getstarted/solana-test-validator.md index c69046e2c..a2dd52058 100644 --- a/content/guides/getstarted/solana-test-validator.md +++ b/content/guides/getstarted/solana-test-validator.md @@ -76,7 +76,7 @@ Once you have the `solana-test-validator` up and running, you can interact with it using various Solana CLI (Command Line Interface) commands. These commands let you [deploy programs](/docs/programs/deploying.md), manage [accounts](/docs/core/accounts.md), send -[transactions](/docs/core/transactions.md), and much more. Here’s a detailed +[transactions](/docs/core/transactions.md), and much more. Here's a detailed guide on the key commands you will use. ### Checking the Status of the Test Validator diff --git a/content/guides/javascript/get-program-accounts.md b/content/guides/javascript/get-program-accounts.md index 3b750e21d..dedc4c497 100644 --- a/content/guides/javascript/get-program-accounts.md +++ b/content/guides/javascript/get-program-accounts.md @@ -144,7 +144,7 @@ token accounts that are owned by our wallet address. When looking at a token account, we can see the first two fields stored on a token account are both pubkeys, and that each pubkey is 32 bytes in length. Given that `owner` is the second field, we should begin our `memcmp` at an `offset` of 32 bytes. From -here, we’ll be looking for accounts whose owner field matches our wallet +here, we'll be looking for accounts whose owner field matches our wallet address. ![Account Size](/public/assets/guides/get-program-accounts/memcmp.png) @@ -217,7 +217,7 @@ Much like `memcmp`, `dataSlice` accepts two arguments: - `length`: The number of bytes which should be returned `dataSlice` is particularly useful when we run queries on a large dataset but -don’t actually care about the account data itself. An example of this would be +don't actually care about the account data itself. 
An example of this would be if we wanted to find the number of token accounts (i.e. number of token holders) for a particular token mint. @@ -301,7 +301,7 @@ Found 3 token account(s) for mint BUGuuhPsHpk8YZrL2GctsCtXGneL1gmT5zYb7eMHZDWf ``` By combining all three parameters (`dataSlice`, `dataSize`, and `memcmp`) we can -limit the scope of our query and efficiently return only the data we’re +limit the scope of our query and efficiently return only the data we're interested in. ## Other Resources diff --git a/content/guides/token-extensions/getting-started.md b/content/guides/token-extensions/getting-started.md index 67c2ac233..5e43e6a42 100644 --- a/content/guides/token-extensions/getting-started.md +++ b/content/guides/token-extensions/getting-started.md @@ -105,7 +105,7 @@ make sense to combine: - Confidential transfer + permanent delegate Other than these, you have the option to customize with any combination of token -extensions that suit your project’s needs. +extensions that suit your project's needs. ## How do I add custom logic to my tokens with token extensions? @@ -125,7 +125,7 @@ It is important to note that while transfer hooks give the capability to insert custom logic within a transfer, all accounts from the initial transfer are converted to read-only accounts. This means that the signer privileges of the sender do not extend to the Transfer Hook program. This is to avoid potential -unexpected logic executing on someone’s wallet who interacts with a token with +unexpected logic executing on someone's wallet who interacts with a token with transfer hooks, protecting the users. 
You can diff --git a/content/guides/token-extensions/transfer-hook.md b/content/guides/token-extensions/transfer-hook.md index b5b70559b..685850004 100644 --- a/content/guides/token-extensions/transfer-hook.md +++ b/content/guides/token-extensions/transfer-hook.md @@ -790,7 +790,7 @@ create_account( )?; ``` -Once we’ve created the account, we initialize the account data to store the list +Once we've created the account, we initialize the account data to store the list of ExtraAccountMetas. ```rust @@ -810,7 +810,7 @@ ExtraAccountMetas account. ### Custom Transfer Hook Instruction -Next, let’s implement the custom `transfer_hook` instruction. This is the +Next, let's implement the custom `transfer_hook` instruction. This is the instruction the Token Extension program will invoke on every token transfer. In this example, we will require a fee paid in wSOL for every token transfer. diff --git a/content/guides/wallets/add-solana-wallet-adapter-to-nextjs.md b/content/guides/wallets/add-solana-wallet-adapter-to-nextjs.md index f360fd035..e0e8f5d1c 100644 --- a/content/guides/wallets/add-solana-wallet-adapter-to-nextjs.md +++ b/content/guides/wallets/add-solana-wallet-adapter-to-nextjs.md @@ -148,7 +148,7 @@ import the provided standard CSS styles required for these react components to be displayed properly in our application. Each of these styles can be easily overridden to customize the look. -Let’s import these dependencies and use them further in the context/provider +Let's import these dependencies and use them further in the context/provider component we are building: ```tsx filename=AppWalletProvider.tsx @@ -340,7 +340,7 @@ side of your app that is a child of your `AppWalletAdapter` context. In this example guide, it will be your entire application. - the `useWallet` hook has details like `publicKey` and state of the wallet, - whether it’s `connecting` or it’s `connected`. + whether it's `connecting` or it's `connected`. 
- the `useConnection` hook will facilitate your application's connection to the Solana blockchain, via your RPC endpoint @@ -397,7 +397,7 @@ const getAirdropOnClick = async () => { ### Getting a wallet balance -Here’s an example of getting the SOL balance of the wallet connected using the +Here's an example of getting the SOL balance of the wallet connected using the `useConnection` and `useWallet` hooks. [`getBalance`](https://solana.com/docs/rpc/http/getbalance#parameters) is an RPC @@ -425,7 +425,7 @@ With functions like these and the ones provided within the wallet adapter packages, you can detect whether the user's wallet is connected or not, create a button to get an airdrop of devnet or SOL in the network defined, and more. -Let’s make another page now to demonstrate how we can use each of these hooks to +Let's make another page now to demonstrate how we can use each of these hooks to access actually access the `connection` object and your user's wallet state to send or sign transactions, read the wallet balance, and test functionality. diff --git a/content/workshops/solana-101.md b/content/workshops/solana-101.md index 1210c2ed8..72850d2b4 100644 --- a/content/workshops/solana-101.md +++ b/content/workshops/solana-101.md @@ -9,7 +9,7 @@ repoUrl: https://github.com/Solana-Workshops/solana-101 duration: "2 hours" objectives: - The Solana Network - - Solana’s Programming Model + - Solana's Programming Model - Tokens & NFTs tags: - Introduction @@ -36,7 +36,7 @@ authorGithubUsername: buffalojoec - Technical Advantages - Network Overview -#### Solana’s Programming Model +#### Solana's Programming Model - Accounts @@ -64,19 +64,19 @@ authorGithubUsername: buffalojoec ### Why Solana? -Let’s talk about the main technological advantages to building a decentralized +Let's talk about the main technological advantages to building a decentralized application on Solana. 
-Solana has extremely fast block confirmation times, so users don’t have to wait +Solana has extremely fast block confirmation times, so users don't have to wait to make sure their action worked. -Solana’s transaction fees are exceptionally low, so developers can build more +Solana's transaction fees are exceptionally low, so developers can build more robust user experiences that cost less. -Let’s take a brief look at how Solana’s network creates blocks and processes +Let's take a brief look at how Solana's network creates blocks and processes transactions. Like most proof-of-stake networks, Solana elects a leader for each block -creation cycle, who’s responsible for creating a new block. +creation cycle, who's responsible for creating a new block. Unlike Ethereum - Solana does not use a mempool. Instead, it forwards new transactions to the next leader in the block creation cycle, which means when @@ -86,12 +86,12 @@ into a new block. Next, Solana leverages a high-throughput engine called Turbine that disseminates information about a new block to the rest of the network. -When a block’s transactions are executed, Solana’s runtime actually allows the +When a block's transactions are executed, Solana's runtime actually allows the operations within each transaction to run in parallel wherever possible. The combination of these 3 innovations leads to greatly increased speed and throughput for the network. -Solana’s most popular innovation is Proof-of-History, which leverages a +Solana's most popular innovation is Proof-of-History, which leverages a Verifiable-Delay Function (VDF) to allow all nodes in the network to agree on the passage of time. @@ -100,23 +100,23 @@ Weighted QoS, makes it perfect for high-performance applications. ### Programming on Solana -Now let’s dive into the concepts you’ll need to know when programming on Solana. -The first thing we’ll want to understand is the concept of an account. 
+Now let's dive into the concepts you'll need to know when programming on Solana. +The first thing we'll want to understand is the concept of an account. #### Account An account on Solana is a slice of data from the blockchain. Everything on Solana is an account! You can kind of think of it like a -computer’s file system - where everything is a file! +computer's file system - where everything is a file! Every account has a unique address, holds some balance of SOL, and can store arbitrary data. Based on the size of that arbitrary data, a user is required to -pay some value of SOL for what’s called “Rent”. +pay some value of SOL for what's called “Rent”. Since this is blockchain data, anyone can read from an account. Also, anyone can -credit SOL or tokens to an account. However, only an account’s owner can modify -its data - which includes debiting it’s SOL balance. +credit SOL or tokens to an account. However, only an account's owner can modify +its data - which includes debiting its SOL balance. ``` { @@ -134,9 +134,9 @@ its data - which includes debiting it’s SOL balance. If we take a look at what an actual account looks like in raw form, we can see some of the fields present on all accounts shown here. -The “key” field is just that account’s address. +The “key” field is just that account's address. -The “lamports” field simply tracks that account’s current balance of SOL. +The “lamports” field simply tracks that account's current balance of SOL. Lamports are the smaller denomination of SOL. “Data” is where the arbitrary data is stored inside of an account. @@ -145,7 +145,7 @@ If that arbitrary data stored in this account is actually an executable program, the “is_executable” boolean will be set to true. Lastly, the “owner” field determines which Solana program has the authority to -perform changes to this account’s data, including its balance of Lamports. +perform changes to this account's data, including its balance of Lamports.
#### Programs @@ -157,9 +157,9 @@ we mentioned before. Right now, Solana programs can be written in Rust, C/C++ or Python. Soon, we may be able to write programs in other languages - such as TypeScript and GoLang. -Unlike Ethereum’s “smart contracts”, programs don’t actually have state of their +Unlike Ethereum's “smart contracts”, programs don't actually have state of their own. Instead, they perform reads and writes on accounts from the blockchain. To -perform a write, this program must be the designated owner of the account it’s +perform a write, this program must be the designated owner of the account it's attempting to modify. Programs are designed to process what are called “instructions”, and they can also send these instructions to other programs on the network. diff --git a/docs/advanced/confirmation.md b/docs/advanced/confirmation.md index 98e56cb57..02cc91ad5 100644 --- a/docs/advanced/confirmation.md +++ b/docs/advanced/confirmation.md @@ -40,7 +40,7 @@ where the magic happens and at a high level it consists of four components: - a **list of accounts** to load, and - a **“recent blockhash.”** -In this article, we’re going to be focusing a lot on a transaction’s +In this article, we're going to be focusing a lot on a transaction's [recent blockhash](/docs/terminology.md#blockhash) because it plays a big role in transaction confirmation. @@ -65,14 +65,14 @@ touch on everything except steps 1 and 4. A [“blockhash”](/docs/terminology.md#blockhash) refers to the last Proof of History (PoH) hash for a [“slot”](/docs/terminology.md#slot) (description -below). Since Solana uses PoH as a trusted clock, a transaction’s recent +below). Since Solana uses PoH as a trusted clock, a transaction's recent blockhash can be thought of as a **timestamp**. 
### Proof of History refresher -Solana’s Proof of History mechanism uses a very long chain of recursive SHA-256 +Solana's Proof of History mechanism uses a very long chain of recursive SHA-256 hashes to build a trusted clock. The “history” part of the name comes from the -fact that block producers hash transaction id’s into the stream to record which +fact that block producers hash transaction ids into the stream to record which transactions were processed in their block. [PoH hash calculation](https://github.com/anza-xyz/agave/blob/aa0922d6845e119ba466f88497e8209d1c82febc/entry/src/poh.rs#L79): @@ -123,7 +123,7 @@ the runtime. ### Example of transaction expiration -Let’s walk through a quick example: +Let's walk through a quick example: 1. A validator is actively producing a new block for the current slot 2. The validator receives a transaction from a user with the recent blockhash @@ -138,26 +138,26 @@ Let’s walk through a quick example: then starts producing the block for the next slot (validators get to produce blocks for 4 consecutive slots) 6. The validator checks that same transaction again and finds it is now 152 - blockhashes old and rejects it because it’s too old :( + blockhashes old and rejects it because it's too old :( ## Why do transactions expire? -There’s a very good reason for this actually, it’s to help validators avoid +There's a very good reason for this: it's to help validators avoid processing the same transaction twice. A naive brute force approach to prevent double processing could be to check -every new transaction against the blockchain’s entire transaction history. But +every new transaction against the blockchain's entire transaction history. But by having transactions expire after a short amount of time, validators only need to check if a new transaction is in a relatively small set of _recently_ processed transactions.
### Other blockchains -Solana’s approach of prevent double processing is quite different from other +Solana's approach to preventing double processing is quite different from other blockchains. For example, Ethereum tracks a counter (nonce) for each transaction sender and will only process transactions that use the next valid nonce. -Ethereum’s approach is simple for validators to implement, but it can be +Ethereum's approach is simple for validators to implement, but it can be problematic for users. Many people have encountered situations when their Ethereum transactions got stuck in a _pending_ state for a long time and all the later transactions, which used higher nonce values, were blocked from @@ -165,12 +165,12 @@ processing. ### Advantages on Solana -There are a few advantages to Solana’s approach: +There are a few advantages to Solana's approach: 1. A single fee payer can submit multiple transactions at the same time that are - allowed to be processed in any order. This might happen if you’re using + allowed to be processed in any order. This might happen if you're using multiple applications at the same time. -2. If a transaction doesn’t get committed to a block and expires, users can try +2. If a transaction doesn't get committed to a block and expires, users can try again knowing that their previous transaction will NOT ever be processed. By not using counters, the Solana wallet experience may be easier for users to @@ -181,7 +181,7 @@ quickly and avoid annoying pending states. Of course there are some disadvantages too: -1. Validators have to actively track a set of all processed transaction id’s to +1. Validators have to actively track a set of all processed transaction ids to prevent double processing. 2. If the expiration time period is too short, users might not be able to submit their transaction before it expires.
@@ -189,7 +189,7 @@ Of course there are some disadvantages too: These disadvantages highlight a tradeoff in how transaction expiration is configured. If the expiration time of a transaction is increased, validators need to use more memory to track more transactions. If expiration time is -decreased, users don’t have enough time to submit their transaction. +decreased, users don't have enough time to submit their transaction. Currently, Solana clusters require that transactions use blockhashes that are no more than 151 blocks old. @@ -208,27 +208,27 @@ target time of 400ms. One minute is not a lot of time considering that a client needs to fetch a recent blockhash, wait for the user to sign, and finally hope that the -broadcasted transaction reaches a leader that is willing to accept it. Let’s go +broadcasted transaction reaches a leader that is willing to accept it. Let's go through some tips to help avoid confirmation failures due to transaction expiration! ### Fetch blockhashes with the appropriate commitment level -Given the short expiration time frame, it’s imperative that clients and +Given the short expiration time frame, it's imperative that clients and applications help users create transactions with a blockhash that is as recent as possible. When fetching blockhashes, the current recommended RPC API is called [`getLatestBlockhash`](/docs/rpc/http/getLatestBlockhash.mdx). By default, this API uses the `finalized` commitment level to return the most recently finalized -block’s blockhash. However, you can override this behavior by +block's blockhash. However, you can override this behavior by [setting the `commitment` parameter](/docs/rpc/index.mdx#configuring-state-commitment) to a different commitment level. 
**Recommendation** The `confirmed` commitment level should almost always be used for RPC requests -because it’s usually only a few slots behind the `processed` commitment and has +because it's usually only a few slots behind the `processed` commitment and has a very low chance of belonging to a dropped [fork](https://docs.solanalabs.com/consensus/fork-generation). @@ -237,10 +237,10 @@ But feel free to consider the other options: - Choosing `processed` will let you fetch the most recent blockhash compared to other commitment levels and therefore gives you the most time to prepare and process a transaction. But due to the prevalence of forking in the Solana - blockchain, roughly 5% of blocks don’t end up being finalized by the cluster - so there’s a real chance that your transaction uses a blockhash that belongs + blockchain, roughly 5% of blocks don't end up being finalized by the cluster + so there's a real chance that your transaction uses a blockhash that belongs to a dropped fork. Transactions that use blockhashes for abandoned blocks - won’t ever be considered recent by any blocks that are in the finalized + won't ever be considered recent by any blocks that are in the finalized blockchain. - Using the [default commitment](/docs/rpc#default-commitment) level `finalized` will eliminate any risk that the blockhash you choose will belong to a dropped @@ -259,22 +259,22 @@ into issues due to one node lagging behind the other. When RPC nodes receive a `sendTransaction` request, they will attempt to determine the expiration block of your transaction using the most recent finalized block or with the block selected by the `preflightCommitment` -parameter. A **VERY** common issue is that a received transaction’s blockhash +parameter. A **VERY** common issue is that a received transaction's blockhash was produced after the block used to calculate the expiration for that -transaction. If an RPC node can’t determine when your transaction expires, it +transaction. 
If an RPC node can't determine when your transaction expires, it will only forward your transaction **one time** and afterwards will then **drop** the transaction. Similarly, when RPC nodes receive a `simulateTransaction` request, they will simulate your transaction using the most recent finalized block or with the block selected by the `preflightCommitment` parameter. If the block chosen for -simulation is older than the block used for your transaction’s blockhash, the +simulation is older than the block used for your transaction's blockhash, the simulation will fail with the dreaded “blockhash not found” error. **Recommendation** Even if you use `skipPreflight`, **ALWAYS** set the `preflightCommitment` -parameter to the same commitment level used to fetch your transaction’s +parameter to the same commitment level used to fetch your transaction's blockhash for both `sendTransaction` and `simulateTransaction` requests. ### Be wary of lagging RPC nodes when sending transactions @@ -290,18 +290,18 @@ lagging behind the first. For `sendTransaction` requests, clients should keep resending a transaction to a RPC node on a frequent interval so that if an RPC node is slightly lagging -behind the cluster, it will eventually catch up and detect your transaction’s +behind the cluster, it will eventually catch up and detect your transaction's expiration properly. For `simulateTransaction` requests, clients should use the [`replaceRecentBlockhash`](/docs/rpc/http/simulateTransaction.mdx) parameter to -tell the RPC node to replace the simulated transaction’s blockhash with a +tell the RPC node to replace the simulated transaction's blockhash with a blockhash that will always be valid for simulation. ### Avoid reusing stale blockhashes Even if your application has fetched a very recent blockhash, be sure that -you’re not reusing that blockhash in transactions for too long. The ideal +you're not reusing that blockhash in transactions for too long. 
The ideal scenario is that a recent blockhash is fetched right before a user signs their transaction. @@ -309,19 +309,19 @@ transaction. Poll for new recent blockhashes on a frequent basis to ensure that whenever a user triggers an action that creates a transaction, your application already has -a fresh blockhash that’s ready to go. +a fresh blockhash that's ready to go. **Recommendation for wallets** -Poll for new recent blockhashes on a frequent basis and replace a transaction’s +Poll for new recent blockhashes on a frequent basis and replace a transaction's recent blockhash right before they sign the transaction to ensure the blockhash is as fresh as possible. ### Use healthy RPC nodes when fetching blockhashes By fetching the latest blockhash with the `confirmed` commitment level from an -RPC node, it’s going to respond with the blockhash for the latest confirmed -block that it’s aware of. Solana’s block propagation protocol prioritizes +RPC node, it's going to respond with the blockhash for the latest confirmed +block that it's aware of. Solana's block propagation protocol prioritizes sending blocks to staked nodes so RPC nodes naturally lag about a block behind the rest of the cluster. They also have to do more work to handle application requests and can lag a lot more under heavy user traffic. @@ -338,7 +338,7 @@ still return a blockhash that is just about to expire. Monitor the health of your RPC nodes to ensure that they have an up-to-date view of the cluster state with one of the following methods: -1. Fetch your RPC node’s highest processed slot by using the +1. 
Fetch your RPC node's highest processed slot by using the
   [`getSlot`](/docs/rpc/http/getSlot.mdx) RPC API with the `processed`
   commitment level and then call the
   [`getMaxShredInsertSlot`](/docs/rpc/http/getMaxShredInsertSlot.mdx) RPC API to
@@ -373,25 +373,25 @@ To start using durable transactions, a user first needs to submit a transaction
that
[invokes instructions that create a special on-chain “nonce”
account](https://docs.rs/solana-program/latest/solana_program/system_instruction/fn.create_nonce_account.html)
and stores a “durable blockhash” inside of it. At any point in the future (as
-long as the nonce account hasn’t been used yet), the user can create a durable
+long as the nonce account hasn't been used yet), the user can create a durable
transaction by following these 2 rules:

1. The instruction list must start with an
   [“advance nonce” system
   instruction](https://docs.rs/solana-program/latest/solana_program/system_instruction/fn.advance_nonce_account.html)
   which loads their on-chain nonce account
-2. The transaction’s blockhash must be equal to the durable blockhash stored by
+2. The transaction's blockhash must be equal to the durable blockhash stored by
   the on-chain nonce account

-Here’s how these durable transactions are processed by the Solana runtime:
+Here's how these durable transactions are processed by the Solana runtime:

-1. If the transaction’s blockhash is no longer “recent”, the runtime checks if
-   the transaction’s instruction list begins with an “advance nonce” system
+1. If the transaction's blockhash is no longer “recent”, the runtime checks if
+   the transaction's instruction list begins with an “advance nonce” system
   instruction
2. If so, it then loads the nonce account specified by the “advance nonce”
   instruction
-3. Then it checks that the stored durable blockhash matches the transaction’s
+3. Then it checks that the stored durable blockhash matches the transaction's
   blockhash
-4.
Lastly it makes sure to advance the nonce account’s stored blockhash to the +4. Lastly it makes sure to advance the nonce account's stored blockhash to the latest recent blockhash to ensure that the same transaction can never be processed again diff --git a/docs/advanced/retry.md b/docs/advanced/retry.md index f69bddb8e..51e38f467 100644 --- a/docs/advanced/retry.md +++ b/docs/advanced/retry.md @@ -24,7 +24,7 @@ their own custom rebroadcasting logic. - Developers should enable preflight checks to raise errors before transactions are submitted - Before re-signing any transaction, it is **very important** to ensure that the - initial transaction’s blockhash has expired + initial transaction's blockhash has expired ## The Journey of a Transaction @@ -59,13 +59,13 @@ forwarding it to the relevant leaders. UDP allows validators to quickly communicate with one another, but does not provide any guarantees regarding transaction delivery. -Because Solana’s leader schedule is known in advance of every +Because Solana's leader schedule is known in advance of every [epoch](/docs/terminology.md#epoch) (~2 days), an RPC node will broadcast its transaction directly to the current and next leaders. This is in contrast to other gossip protocols such as Ethereum that propagate transactions randomly and broadly across the entire network. By default, RPC nodes will try to forward transactions to leaders every two seconds until either the transaction is -finalized or the transaction’s blockhash expires (150 blocks or ~1 minute 19 +finalized or the transaction's blockhash expires (150 blocks or ~1 minute 19 seconds as of the time of this writing). If the outstanding rebroadcast queue size is greater than [10,000 transactions](https://github.com/solana-labs/solana/blob/bfbbc53dac93b3a5c6be9b4b65f679fdb13e41d9/send-transaction-service/src/send_transaction_service.rs#L20), @@ -75,7 +75,7 @@ that RPC operators can adjust to change the default behavior of this retry logic. 
When an RPC node broadcasts a transaction, it will attempt to forward the -transaction to a leader’s +transaction to a leader's [Transaction Processing Unit (TPU)](https://github.com/solana-labs/solana/blob/cd6f931223181d5a1d47cba64e857785a175a760/core/src/validator.rs#L867). The TPU processes transactions in five distinct phases: @@ -105,7 +105,7 @@ For more information on the TPU, please refer to ## How Transactions Get Dropped -Throughout a transaction’s journey, there are a few scenarios in which the +Throughout a transaction's journey, there are a few scenarios in which the transaction can be unintentionally dropped from the network. ### Before a transaction is processed @@ -113,7 +113,7 @@ transaction can be unintentionally dropped from the network. If the network drops a transaction, it will most likely do so before the transaction is processed by a leader. UDP [packet loss](https://en.wikipedia.org/wiki/Packet_loss) is the simplest reason -why this might occur. During times of intense network load, it’s also possible +why this might occur. During times of intense network load, it's also possible for validators to become overwhelmed by the sheer number of transactions required for processing. While validators are equipped to forward surplus transactions via `tpu_forwards`, there is a limit to the amount of data that can @@ -127,7 +127,7 @@ There are also two lesser known reasons why a transaction may be dropped before it is processed. The first scenario involves transactions that are submitted via an RPC pool. Occasionally, part of the RPC pool can be sufficiently ahead of the rest of the pool. This can cause issues when nodes within the pool are required -to work together. In this example, the transaction’s +to work together. In this example, the transaction's [recentBlockhash](/docs/core/transactions.md#recent-blockhash) is queried from the advanced part of the pool (Backend A). 
When the transaction is submitted
to the lagging part of the pool (Backend B), the nodes will not recognize the
@@ -139,7 +139,7 @@ transaction submission if developers enable

Temporary network forks can also result in dropped transactions. If a
validator is slow to replay its blocks within the Banking Stage, it may end up
-creating a minority fork. When a client builds a transaction, it’s possible for
+creating a minority fork. When a client builds a transaction, it's possible for
the transaction to reference a `recentBlockhash` that only exists on the
minority fork. After the transaction is submitted, the cluster can then switch
away from its minority fork before the transaction is processed. In this
@@ -150,7 +150,7 @@ scenario, the transaction is dropped due to the blockhash not being found.

### After a transaction is processed and before it is finalized

In the event a transaction references a `recentBlockhash` from a minority fork,
-it’s still possible for the transaction to be processed. In this case, however,
+it's still possible for the transaction to be processed. In this case, however,
it would be processed by the leader on the minority fork. When this leader
attempts to share its processed transactions with the rest of the network, it
would fail to reach consensus with the majority of validators that do not
@@ -201,8 +201,8 @@ the transaction will be processed or finalized by the cluster.

## Customizing Rebroadcast Logic

In order to develop their own rebroadcasting logic, developers should take
-advantage of `sendTransaction`’s `maxRetries` parameter.
If provided, +`maxRetries` will override an RPC node's default retry logic, allowing developers to manually control the retry process [within reasonable bounds](https://github.com/solana-labs/solana/blob/98707baec2385a4f7114d2167ef6dfb1406f954f/validator/src/main.rs#L1258-L1274). @@ -210,9 +210,9 @@ A common pattern for manually retrying transactions involves temporarily storing the `lastValidBlockHeight` that comes from [getLatestBlockhash](/docs/rpc/http/getLatestBlockhash.mdx). Once stashed, an application can then -[poll the cluster’s blockheight](/docs/rpc/http/getBlockHeight.mdx) and manually +[poll the cluster's blockheight](/docs/rpc/http/getBlockHeight.mdx) and manually retry the transaction at an appropriate interval. In times of network -congestion, it’s advantageous to set `maxRetries` to 0 and manually rebroadcast +congestion, it's advantageous to set `maxRetries` to 0 and manually rebroadcast via a custom algorithm. While some applications may employ an [exponential backoff](https://en.wikipedia.org/wiki/Exponential_backoff) algorithm, others such as [Mango](https://www.mango.markets/) opt to @@ -310,7 +310,7 @@ for, it is recommended that developers keep `skipPreflight` set to `false`. Despite all attempts to rebroadcast, there may be times in which a client is required to re-sign a transaction. Before re-signing any transaction, it is -**very important** to ensure that the initial transaction’s blockhash has +**very important** to ensure that the initial transaction's blockhash has expired. If the initial blockhash is still valid, it is possible for both transactions to be accepted by the network. To an end-user, this would appear as if they unintentionally sent the same transaction twice. 
diff --git a/docs/economics/index.md b/docs/economics/index.md index baf26baea..55444a801 100644 --- a/docs/economics/index.md +++ b/docs/economics/index.md @@ -8,7 +8,7 @@ sidebarSortOrder: 5 **Subject to change.** -Solana’s crypto-economic system is designed to promote a healthy, long term +Solana's crypto-economic system is designed to promote a healthy, long term self-sustaining economy with participant incentives aligned to the security and decentralization of the network. The main participants in this economy are validation-clients. Their contributions to the network, state validation, and diff --git a/docs/economics/inflation/terminology.md b/docs/economics/inflation/terminology.md index 56b1c8154..e8526d500 100644 --- a/docs/economics/inflation/terminology.md +++ b/docs/economics/inflation/terminology.md @@ -14,7 +14,7 @@ genesis block or protocol inflation) minus any tokens that have been burnt (via transaction fees or other mechanism) or slashed. At network launch, 500,000,000 SOL were instantiated in the genesis block. Since then the Total Current Supply has been reduced by the burning of transaction fees and a planned token -reduction event. Solana’s _Total Current Supply_ can be found at +reduction event. Solana's _Total Current Supply_ can be found at https://explorer.solana.com/supply ### Inflation Rate [%] @@ -52,7 +52,7 @@ _Inflation Schedule_. remaining fee retained by the validator that processes the transaction. - Additional factors such as loss of private keys and slashing events should also be considered in a holistic analysis of the _Effective Inflation Rate_. - For example, it’s estimated that $10-20\%$ of all BTC have been lost and are + For example, it's estimated that $10-20\%$ of all BTC have been lost and are unrecoverable and that networks may experience similar yearly losses at the rate of $1-2\%$. 
diff --git a/docs/intro/quick-start/cross-program-invocation.md b/docs/intro/quick-start/cross-program-invocation.md
index 94eb78b60..5bac88834 100644
--- a/docs/intro/quick-start/cross-program-invocation.md
+++ b/docs/intro/quick-start/cross-program-invocation.md
@@ -562,7 +562,7 @@ Running tests...

You can then inspect the SolanaFM links to view the transaction details, where
-you’ll find the CPIs for the transfer instructions within the update and delete
+you'll find the CPIs for the transfer instructions within the update and delete
instructions.

![Update CPI](/assets/docs/intro/quickstart/cpi-update.png)

diff --git a/docs/more/exchange.md b/docs/more/exchange.md
index a85268db4..2aae6ecff 100644
--- a/docs/more/exchange.md
+++ b/docs/more/exchange.md
@@ -767,7 +767,7 @@ curl https://api.devnet.solana.com -X POST -H "Content-Type: application/json" -

## Prioritization Fees and Compute Units

-In periods of high demand, it’s possible for a transaction to expire before a
+In periods of high demand, it's possible for a transaction to expire before a
validator has included such transactions in their block because they chose other
transactions with higher economic value. Valid Transactions on Solana may be
delayed or dropped if Prioritization Fees are not implemented properly.
@@ -817,7 +817,7 @@ may only return the lowest fee for each block. This will often be zero, which is
not a fully useful approximation of what Prioritization Fee to use in order to
avoid being rejected by validator nodes.

-The `getRecentPrioritizationFees` API takes accounts’ pubkeys as parameters, and
+The `getRecentPrioritizationFees` API takes accounts' pubkeys as parameters, and
then returns the highest of the minimum prioritization fees for these accounts.
When no account is specified, the API will return the lowest fee to land in a
block, which is usually zero (unless the block is full).
diff --git a/docs/programs/testing.md b/docs/programs/testing.md index 029c24cfa..32d3bcdc3 100644 --- a/docs/programs/testing.md +++ b/docs/programs/testing.md @@ -18,7 +18,7 @@ There are two ways to test programs on Solana: 2. The various [BanksClient-based](https://docs.rs/solana-banks-client/latest/solana_banks_client/) test frameworks for SBF (Solana Bytecode Format) programs: Bankrun is a - framework that simulates a Solana bank’s operations, enabling developers to + framework that simulates a Solana bank's operations, enabling developers to deploy, interact with, and assess the behavior of programs under test conditions that mimic the mainnet. It helps set up the test environment and offers tools for detailed transaction insights, enhancing debugging and @@ -38,13 +38,13 @@ There are two ways to test programs on Solana: In this guide, we are using Solana Bankrun. `Bankrun` is a superfast, powerful, and lightweight framework for testing Solana programs in Node.js. -- The biggest advantage of using Solana Bankrun is that you don’t have to set +- The biggest advantage of using Solana Bankrun is that you don't have to set up - an environment to test programs like you’d have to do while using the + an environment to test programs like you'd have to do while using the `solana-test-validator`. Instead, you can do that with a piece of code, inside the tests. -- It also dynamically sets time and account data, which isn’t possible with +- It also dynamically sets time and account data, which isn't possible with `solana-test-validator` ## Installation @@ -66,7 +66,7 @@ directories: - `./tests/fixtures` (just create this directory if it doesn't exist already). - Your current working directory. - A directory you define in the `BPF_OUT_DIR` or `SBF_OUT_DIR` environment - variables. `export BPF_OUT_DIR=’/path/to/binary’` + variables. 
`export BPF_OUT_DIR='/path/to/binary'`
- Build your program specifying the correct directory so that the library can
  pick the file up from the directory just from the name.
  `cargo build-sbf --manifest-path=./program/Cargo.toml --sbf-out-dir=./tests/fixtures`
@@ -151,7 +151,7 @@ let transaction = await client.processTransaction(tx);

## Example

-Here’s an example to write test for a
+Here's an example of how to write a test for a
[hello world program](https://github.com/solana-developers/program-examples/tree/main/basics/hello-solana/native):

```typescript