I have several TB of borg backups, uploaded to Backblaze B2. I could immediately see how many resources I was using, how many API calls, and so on. It's very easy to see and predict the next bill. I can see exactly which bucket uses the most resources, and which is growing over time.

Because I'm cheap, I want to upload those files to AWS Glacier, which theoretically costs a quarter of B2 for storage, but where API calls are extremely expensive. So I want to know the details. I wouldn't like to get a bill with $5 in storage and $500 in API calls.
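To make the storage-cost gap concrete, here is a quick sketch using approximate published per-GB rates (these prices are assumptions based on the providers' public pricing pages; check current rates before relying on them):

```python
# Rough monthly storage cost comparison for 1 TB.
# Assumed rates: Backblaze B2 ~$0.006/GB-month,
# S3 Glacier Deep Archive ~$0.00099/GB-month.
TB_IN_GB = 1024

b2_storage = TB_IN_GB * 0.006            # ≈ $6.14/month
deep_archive = TB_IN_GB * 0.00099        # ≈ $1.01/month

print(f"B2:           ${b2_storage:.2f}/month")
print(f"Deep Archive: ${deep_archive:.2f}/month")
```

Note this covers storage only; the API-call costs the post worries about come on top of these numbers.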

I uploaded a backup, but nowhere in AWS can I see how many resources I'm using, how much I'm going to pay, how many API calls were made, how much the user XYZ spent, and so on.

It looks like it’s designed for an approach like “just use our product freely, don’t worry about pricing, it’s a problem for the financial department of your company”.

In the AWS console I found "S3 Storage Lens", but it says I need to delegate access to someone else, because reasons. I tried to create another user in my 1-user org, but after wasting 2 hours I wasn't able to find a way to add those permissions.

I tried to create a dashboard in AWS Cost Explorer, but all the indicators are null or zero.

So, how can I see how many API calls are made and how much storage is used, to predict the final bill? Or is the only way to pray, wait for the end of the month, and hope everything is itemized there in detail?

  • unchain@lemmy.world · 11 months ago

If you're cheap, don't use AWS. There are many ways things can get out of control and leave you with a hefty bill to pay. The pricing structure is complicated because you pay for API calls and have multiple storage tiers. Overall, AWS will be expensive compared to others. Other S3-compatible providers are more upfront with costs. Check out Wasabi, iDrive or Backblaze for a price comparison.

  • ndguardian@lemmy.world · 11 months ago

You should be able to punch this info into the AWS cost calculator and see it, right? I work with AWS daily for my day job and regularly have to pull these estimates for upcoming projects. Granted, these would be estimates.

    As for current costs, generally AWS lags by a couple hours to a day before costs show up in cost explorer, so not seeing them immediately isn’t too surprising.

  • DeltaTangoLima@reddrefuge.com · 11 months ago

As many others have said, AWS has a pricing calculator that lets you determine your likely costs.

As a rough calc in the tool for us-east-2 (Ohio): if you PUT (a paid action) 1,000 objects of 1024 MB each (1 TB) per month, and lifecycle-transition all 1,000 objects into Glacier Deep Archive each month (another paid action), you'll pay around $1.11 USD per month. You pay nothing to transfer the data in from the internet.
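The calc above can be roughly reproduced by hand. The per-request and per-GB rates below are assumptions (approximate published us-east-2 rates, which may have changed), and the small gap versus the calculator's $1.11 likely comes from items it includes, such as brief Standard-tier storage before the transition:

```python
# Rough reconstruction of the ~$1.11/month estimate.
# Assumed rates: $0.005 per 1,000 S3 Standard PUTs,
# $0.05 per 1,000 lifecycle transitions to Deep Archive,
# $0.00099/GB-month for Deep Archive storage.
puts = 1000
gb_stored = 1024

put_cost = puts / 1000 * 0.005         # $0.005
transition_cost = puts / 1000 * 0.05   # $0.05
storage_cost = gb_stored * 0.00099     # ≈ $1.01

total = put_cost + transition_cost + storage_cost
print(f"≈ ${total:.2f}/month")  # ≈ $1.07
```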

Glacier Deep Archive is what I use for my backups. I have a 2N+C backup strategy, so I only ever intend to restore from these backups should both of my local copies of the data be unavailable (e.g. house fire). In that instance, I'll pay a price for retrieval, as well as endure a waiting period.

  • Moonrise2473@feddit.itOP · 11 months ago

For example, thanks to the helpful graphs at Backblaze, I immediately noticed a lot of expensive "class C" API calls, which I minimized by optimizing the cronjob to account for the daily reset and telling rclone to avoid HEAD requests. So after just a few days I noticed the problem and corrected it. If I did the same on AWS, I would have noticed it only at the end of the month: an expensive lesson.

  • originalucifer@moist.catsweat.com · 11 months ago

i also use amazon, and i watch the costs like a hawk. they can explode quickly.

if you're cost-conscious, i would not recommend amazon. it's a mature environment, but you're paying for it.

that said, cost explorer is your friend. sorting by 'usage type' over time is what made it start working for me.

i was also able to throw that metric into a default cloudwatch dashboard.

if you want serious details, you may need to do as they say and create the user, to access the required metrics.

from recollection, you need to create a user who can access the API required to grab the metrics you want. even in their own system, this user needs to exist before they can show you metrics using their own api.

i ran into similar security hurdles accessing my s3 bucket procedurally.

  • eosph@lemmy.remotelab.uk · 11 months ago

The first screen you get to when you log in should show your current and predicted monthly cost. Failing that, you can use the AWS cost explorer to guesstimate how much you'll pay.

    edit:

You can also go through the cost and billing manager in AWS.

    Edit edit!:

You want to use Glacier, which is cool, pun very much intended. Plug your estimated usage into the cost explorer and you should be good to go.

  • ChojinDSL@discuss.tchncs.de · 11 months ago

Short answer: no. Nobody knows. At least not unless you can accurately predict exactly how many API calls you'll make and how much data you'll transfer.

  • Moonrise2473@feddit.itOP · 11 months ago

UPDATE: after some days, the bill under https://us-east-1.console.aws.amazon.com/billing/home?region=us-east-1#/bills is populated in much detail. Now it's much clearer.

With rclone, my test sending 131 files / 2500 MB, configured to skip upload chunking and to skip HEAD requests on Glacier, created:

    • 110 PutObject requests to Glacier Deep Archive
    • 5 InitiateMultipartUpload requests
    • 5 CompleteMultipartUpload requests
    • 5 UploadPart requests
    • 192 PUT, COPY, POST, or LIST requests
    • 111 GET and all other requests

I think now I can safely upload everything and it shouldn't be too expensive.
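As a sanity check on that conclusion, the measured request counts can be extrapolated to a larger upload. This sketch assumes the request mix stays proportional to data size and uses an assumed rate of $0.05 per 1,000 Deep Archive PUT-class requests (it ignores the cheaper GET/LIST requests in the bill above):

```python
# Extrapolate the measured test (131 files / ~2.5 GB) to a bigger upload.
# PUT-class requests observed: 110 PutObject + 5 Initiate + 5 Complete + 5 UploadPart.
requests_per_gb = (110 + 5 + 5 + 5) / 2.5   # = 50 requests per GB
price_per_1000_puts = 0.05                  # assumed Deep Archive PUT rate (USD)

def upload_request_cost(gb):
    """Estimated one-time request cost (USD) to upload `gb` gigabytes."""
    return requests_per_gb * gb / 1000 * price_per_1000_puts

print(f"${upload_request_cost(2048):.2f}")  # ≈ $5.12 for a 2 TB upload
```

So at these rates the upload itself stays in the single-digit dollars per TB, consistent with the "shouldn't be too expensive" conclusion.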