submitted on 16 Apr 2025 by yogthos@lemmygrad.ml to c/news@hexbear.net (161 points, 100.0% liked)

[-] dead@hexbear.net 16 points 10 months ago* (last edited 10 months ago)

This is the KCNA article the photos came from; it includes 43 photos. The website is hosted in the DPRK.

http://kcna.kp/en/article/q/afd7ef74f27c9ed8d4836d99c3ab1ce6.kcmsf

https://archive.is/Q9VfU

all images mirrored:

  1. https://archive.is/VBwTW/956bbb45a9e8887596d977f6f309cf4f6219b8dd.jpg
  2. https://archive.is/VBwTW/4c49b992e3e7f6411ca03fb5f5002878e05b2642.jpg
  3. https://archive.is/VBwTW/ec990f40b2cda7babfdd98396e6e4b7d9d82c855.jpg
  4. https://archive.is/VBwTW/0ee4fceb6d0fd6c2ff33de960ec41f2da94cae94.jpg
  5. https://archive.is/VBwTW/a4b79934ea46f8537b0e13bd3107e89222bea0f8.jpg
  6. https://archive.is/VBwTW/242c5c101ca8b134ed2f3cc5ad1696e74e4e9a23.jpg
  7. https://archive.is/VBwTW/e64dd2dc1f235557521f04908f49e41342e7086c.jpg
  8. https://archive.is/VBwTW/8054035e89efef5e614ed576d8b0d335e3463909.jpg
  9. https://archive.is/VBwTW/2207e5f65b762ad7da8d1ee559f261bdefd4ca3a.jpg
  10. https://archive.is/VBwTW/787283a8b5712d81911f15436207b19cc9c37461.jpg
  11. https://archive.is/VBwTW/51186caee5858817d0b78936eeedaeb5d8e6ca77.jpg
  12. https://archive.is/VBwTW/2fbd1656d207dd1542886099eb44650bb91590a0.jpg
  13. https://archive.is/VBwTW/ba99cb1e7e7733d5b79abb210c054571af8b3732.jpg
  14. https://archive.is/VBwTW/d42852427f412635658f7058ea47f5e8890fbc45.jpg
  15. https://archive.is/VBwTW/f2d2dff17e40f1ca16cf8a5a6f44dd11af8ef35c.jpg
  16. https://archive.is/VBwTW/c1012b61d69d698abe24e60a8f26e61678fc1dc3.jpg
  17. https://archive.is/VBwTW/81a4dee96a1ffd75bd8e53eb9519e2f43367a470.jpg
  18. https://archive.is/VBwTW/e0d44228bdc182948c533d90f7bc9281b6702e57.jpg
  19. https://archive.is/VBwTW/e09e96313c0f74e8686fa85614435cc8dc7929a6.jpg
  20. https://archive.is/VBwTW/f1fa0ad0056ae9f94d5585eaaaec615927929488.jpg
  21. https://archive.is/VBwTW/83415de3603f16f8af147c8c2de08b4f883f8dd9.jpg
  22. https://archive.is/VBwTW/4c1063031cb9ed8fb7e981831af08d61313fa03a.jpg
  23. https://archive.is/VBwTW/456b1a0fa8485d477bdc5a74e443911b4659a839.jpg
  24. https://archive.is/VBwTW/7c25662c31b42be1c04017f26cf3aed94d6adde6.jpg
  25. https://archive.is/VBwTW/b9a46d433a7c466fbcf0ff388702fad7a2d86981.jpg
  26. https://archive.is/VBwTW/e1125bced7cdc9e769e8528d4a50e900a594f8d7.jpg
  27. https://archive.is/VBwTW/92910895b79ac0087dfdd3047f340128d61f99c4.jpg
  28. https://archive.is/VBwTW/3008135771c7dc22da7b289187389158ddba2b43.jpg
  29. https://archive.is/VBwTW/466379decf72c10f1098649c69fc0786256380ed.jpg
  30. https://archive.is/VBwTW/76d4c2cef5615af462a2ec716edece77ba5b1462.jpg
  31. https://archive.is/VBwTW/e03468a48a4d132b8d58eaff46c31565c5dd9a20.jpg
  32. https://archive.is/VBwTW/469d3d6a1dead40a2682cf3a49983a65c4296cb4.jpg
  33. https://archive.is/VBwTW/6a7caba1c63574fc03ae902c2f737323c8caf0fc.jpg
  34. https://archive.is/VBwTW/cdb99cc62b4c2a4419e10ab7a6113d685a3f07d9.jpg
  35. https://archive.is/VBwTW/85692cdb16754030b28323ae7f35f832eed6fa25.jpg
  36. https://archive.is/VBwTW/66b47e282c986002968f0622c46b5d50ddde0b7f.jpg
  37. https://archive.is/VBwTW/66baee4b33ad6fa86bd2fce85bd0c2ab2988ed92.jpg
  38. https://archive.is/VBwTW/c3856de9e41f4ffce25f55276358d706f1a1c30f.jpg
  39. https://archive.is/VBwTW/c99a5766d796cee3075995b9b26b32dd2e1494ba.jpg
  40. https://archive.is/VBwTW/e81af36f967979255f42095ee865c55c8208961d.jpg
  41. https://archive.is/VBwTW/fa8383545b82865163a7959f2dbce5be00b275a5.jpg
  42. https://archive.is/VBwTW/36d6e519b9498425c1f4df03450a8e101be693d7.jpg
  43. https://archive.is/VBwTW/8a3c42b40256f170a30f479c86d1c939b1b312e3.jpg
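
For anyone who would rather regenerate this list than copy it by hand, a rough sketch using curl and grep is below. It assumes the archive.is snapshot above still embeds the mirrored .jpg paths directly in its HTML and that archive.is serves the page to a plain curl request (neither is guaranteed); urls.txt is just an example filename.

# Pull every archive.is-hosted .jpg link out of the snapshot, one URL per line
curl -sL "https://archive.is/Q9VfU" \
    | grep -oE 'https://archive\.is/[A-Za-z0-9]+/[a-f0-9]{40}\.jpg' \
    | sort -u > urls.txt

# Sanity check: the article contains 43 photos
wc -l urls.txt
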
[-] yogthos@lemmygrad.ml 6 points 10 months ago

a script to download all the images, courtesy of DeepSeek :)

#!/bin/bash
# Script to download multiple URLs from a text file with improved line handling
# Usage: ./download_urls.sh urls.txt [output_directory]

# Check if input file is provided
if [ -z "$1" ]; then
    echo "Error: Please provide a text file containing URLs"
    echo "Usage: $0 <input_file> [output_directory]"
    exit 1
fi

input_file="$1"
output_dir="${2:-./downloads}"

# Check if input file exists
if [ ! -f "$input_file" ]; then
    echo "Error: Input file '$input_file' not found"
    exit 1
fi

# Create output directory if it doesn't exist
mkdir -p "$output_dir"

# Read and process valid URLs into an array
urls=()
while IFS= read -r line || [[ -n "$line" ]]; do
    # Trim leading/trailing whitespace and remove CR characters
    trimmed_line=$(echo "$line" | sed -e 's/^[[:space:]]*//' -e 's/[[:space:]]*$//' | tr -d '\r')
    
    # Skip empty lines after trimming
    [[ -z "$trimmed_line" ]] && continue
    
    # Validate URL format
    if [[ "$trimmed_line" =~ ^https?:// ]]; then
        urls+=("$trimmed_line")
    else
        echo "Skipping invalid URL: $trimmed_line"
    fi
done < "$input_file"

total_urls=${#urls[@]}

if [[ $total_urls -eq 0 ]]; then
    echo "Error: No valid URLs found in input file"
    exit 1
fi

echo "Starting download of $total_urls files to $output_dir"
current=1

# Download each URL from the array
for url in "${urls[@]}"; do
    # Extract filename from URL or generate unique name
    filename=$(basename "$url")
    if [[ -z "$filename" || "$filename" =~ ^$ ]]; then
        filename="file_$(date +%s%N)_${current}.download"
    fi

    echo "[$current/$total_urls] Downloading $url"
    
    # Download with curl including error handling
    if ! curl -L --progress-bar --fail "$url" -o "$output_dir/$filename"; then
        echo "Warning: Failed to download $url"
        rm -f "$output_dir/$filename" 2>/dev/null
    fi
    
    ((current++))
done

echo "Download complete. Files saved to $output_dir"

news

Welcome to c/news! We aim to foster a book-club type environment for discussion and critical analysis of the news. Our policy objectives are:

We ask community members to appreciate the uncertainty inherent in critical analysis of current events and the need to constantly learn, and to take part in the community with humility. None of us are the One True Leftist, not even you, the reader.

Newscomm and Newsmega Rules:

The Hexbear Code of Conduct and Terms of Service apply here.

  1. Link titles: Please use informative link titles. Overly editorialized titles, particularly if they link to opinion pieces, may get your post removed.

  2. Content warnings: Posts on the newscomm and top-level replies on the newsmega should use content warnings appropriately. Please be thoughtful about wording and triggers when describing awful things in post titles.

  3. Fake news: No fake news posts ever, including April 1st. Deliberate fake news posting is a bannable offense. If you mistakenly post fake news the mod team may ask you to delete/modify the post or we may delete it ourselves.

  4. Link sources: All posts must include a link to their source. Screenshots are fine IF you include the link in the post body. If you are citing a Twitter post as news, please include an Xcancel.com link (or another Nitter instance), or at least strip the identifier information from the Twitter link. There is also a Firefox extension, such as Libredirect, that can redirect Twitter links to a Nitter instance; alternatively, archive them as you would any other reactionary source.

  5. Archive sites: We highly encourage use of non-paywalled archive sites (e.g. archive.is, web.archive.org, ghostarchive.org) so that links are widely accessible to the community and so that reactionary sources don't derive data/ad revenue from Hexbear users. If you see a link without an archive link, please archive it yourself and add it to the thread, ask the OP to fix it, or report it to the mods. Including the text of articles in threads is welcome.

  6. Low effort material: Avoid memes/jokes/shitposts in newscomm posts and top-level replies to the newsmega. This kind of content is OK in post replies and in newsmega sub-threads. We encourage the community to balance their contribution of low effort material with effort posts, links to real news/analysis, and meaningful engagement with material posted in the community.

  7. American politics: Discussion and effort posts on the (potential) material impacts of American electoral politics are welcome, but the never-ending circus of American Politics© Brought to You by Mountain Dew™ is not. This refers to polling, pundit reactions, electoral horse races, rumors of who might run, etc.

  8. Electoralism: Please try to avoid struggle sessions about the value of voting/taking part in the electoral system in the West. c/electoralism is right over there.

  9. AI Slop: Don't post AI generated content. Posts about AI race/chip wars/data centers are fine.
