• 2 Posts
  • 12 Comments
Cake day: Dec 26, 2024


Thanks for your comment. That for sure is something to look out for. It is really important to know what you’re running and what possible limitations there could be. Not what the original comment said, though.


This is all very nuanced and there isn’t a clear cut answer. It really depends on what you’re running, for how long you’re running it, your device specs, etc. The LLMs I mentioned in the post did just fine and did not cause any overheating if not used for extended periods of time. You absolutely can run a SMALL LLM and not fry your processor if you don’t overdo it. Even then, I find it extremely unlikely that you’re going to cause permanent damage to your hardware components.

Of course that is something to be mindful of, but that’s not what the person in the original comment said. It does run, but you need to be aware of the limitations and potential consequences. That goes without saying, though.

Just don’t overdo it. Or do, but the worst thing that will happen is your phone getting hella hot and shutting down.


For me the biggest benefits are:

  • Your queries don’t ever leave your computer
  • You don’t have to trust a third party with your data
  • You know exactly what you’re running
  • You can tweak most models to your liking
  • You can upload sensitive information to it and not worry about it
  • It works entirely offline
  • You can run several models

I am not entirely sure, to be completely honest. In my experience it is very little, but it varies: it really depends on how many people connect, for how long they stay connected, etc. If you have limited upload speeds, it might not be a great idea to run it in your browser or on your phone. Maybe try running the standalone proxy directly on your computer using the -capacity flag?
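If you want more control than the browser extension offers, the standalone Go proxy can cap how many clients it serves at once. A rough sketch (repository URL and flag behavior per the upstream Snowflake project; double-check against the current docs before relying on it):

```shell
# Build the standalone Snowflake proxy (requires Go).
git clone https://gitlab.torproject.org/tpo/anti-censorship/pluggable-transports/snowflake.git
cd snowflake/proxy
go build

# -capacity limits how many clients are served concurrently,
# which indirectly bounds your upload bandwidth.
./proxy -capacity 10
```

Lowering the capacity value is the simplest knob if you are on a connection with limited upload speed.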

I haven’t been able to find any specific numbers either, but I did find a post on the Tor Forum, dated April 2023, of a user complaining about high bandwidth usage. That is not the norm in my experience, though.


Thank you for pointing that out. That was worded pretty badly. I corrected it in the post.

For further clarification:

The person who connects to your Snowflake bridge does so over a P2P-like connection. So that person does know your IP address, and your ISP also knows the IP address of the person connecting to your bridge.

However, because Snowflake uses WebRTC, to both of your ISPs it will look like the two of you are using some kind of video-conferencing software, such as Zoom. This makes your traffic inconspicuous and obscures from both ISPs what is actually going on.

To most people, that is not a concern. But ultimately it comes down to your threat model. Historically, there haven't been any cases of people running bridges or entry and middle relays and getting in trouble with law enforcement.

So, will you get in any trouble for running a Snowflake bridge? The answer is quite probably no.

For clarification, you’re not acting as an exit node if you’re running a Snowflake proxy. Please check Tor’s documentation and Snowflake’s documentation.


Not true. If you try to load a model that is beyond your phone’s hardware capabilities, it simply won’t open. Stop spreading FUD.


How to run LLaMA (and other LLMs) on Android.
cross-posted from: https://lemmy.dbzer0.com/post/36841328

> Hello, everyone! I wanted to share my experience of successfully running LLaMA on an Android device. The model that performed the best for me was llama3.2:1b on a mid-range phone with around 8 GB of RAM. I was also able to get it up and running on a lower-end phone with 4 GB of RAM. However, I also tested several other models that worked quite well, including *qwen2.5:0.5b, qwen2.5:1.5b, qwen2.5:3b, smallthinker, tinyllama, deepseek-r1:1.5b, and gemma2:2b*. I hope this helps anyone looking to experiment with these models on mobile devices!
>
> ---
>
> ### **Step 1: Install Termux**
> 1. Download and install **Termux** from the [Google Play Store](https://play.google.com/store/apps/details?id=com.termux&hl=pt_BR) or [F-Droid](https://f-droid.org/pt_BR/packages/com.termux/).
>
> ---
>
> ### **Step 2: Set Up proot-distro and Install Debian**
> 1. Open **Termux** and update the package list:
> ```bash
> pkg update && pkg upgrade
> ```
> 2. Install **proot-distro**:
> ```bash
> pkg install proot-distro
> ```
> 3. Install **Debian** using proot-distro:
> ```bash
> proot-distro install debian
> ```
> 4. Log in to the Debian environment:
> ```bash
> proot-distro login debian
> ```
> You will need to log in every time you want to run Ollama. You will need to repeat this step and all the steps below every time you want to run a model (excluding Step 3 and the first half of Step 4).
>
> ---
>
> ### **Step 3: Install Dependencies**
> 1. Update the package list in Debian:
> ```bash
> apt update && apt upgrade
> ```
> 2. Install curl:
> ```bash
> apt install curl
> ```
>
> ---
>
> ### **Step 4: Install Ollama**
> 1. Run the following command to download and install **Ollama**:
> ```bash
> curl -fsSL https://ollama.com/install.sh | sh
> ```
> 2. Start the Ollama server:
> ```bash
> ollama serve &
> ```
> After you run this command, press Ctrl+C; the server will continue to run in the background.
>
> ---
>
> ### **Step 5: Download and Run the llama3.2:1b Model**
> 1. Use the following command to download and run the **llama3.2:1b** model:
> ```bash
> ollama run llama3.2:1b
> ```
> This step fetches and runs the lightweight 1-billion-parameter version of the Llama 3.2 model.
>
> ---
>
> Running LLaMA and other similar models on Android devices is definitely achievable, even with mid-range hardware. The performance varies depending on the model size and your device's specifications, but with some experimentation you can find a setup that works well for your needs. I’ll make sure to keep this post updated if there are any new developments or additional tips that could help improve the experience. If you have any questions or suggestions, feel free to share them below!
>
> – llama
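Once `ollama serve` is running, you can also talk to it from a script instead of the interactive prompt. A minimal sketch using Ollama's HTTP API on its default port 11434 (the model name is the one pulled above; adjust it if you chose another):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default port


def build_payload(model: str, prompt: str) -> bytes:
    # /api/generate takes a JSON body; stream=False returns a single
    # JSON object instead of a stream of chunks.
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()


def ask(model: str, prompt: str) -> str:
    # Send the prompt to the local Ollama server and return the reply text.
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_payload(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


# Example (with the server running):
#   print(ask("llama3.2:1b", "Say hello in one sentence."))
```

This works from inside the same proot Debian session where the server is running, since both share localhost.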

Of course! I run several Snowflake proxies across my devices and their browsers.


I didn’t use an LLM to make the post. I did, however, use Claude to make it clearer since English is not my first language. I hope that answers your question.


Help people trying to circumvent censorship by running a Snowflake proxy!
cross-posted from: https://lemmy.dbzer0.com/post/36880616

> # Help Combat Internet Censorship by Running a Snowflake Proxy (Browser or Android)
>
> Internet censorship remains a critical threat to free expression and access to information worldwide. In regions like Iran, Russia, and Belarus, journalists, activists, and ordinary citizens face severe restrictions when trying to communicate or access uncensored news. You can support their efforts by operating a Snowflake proxy: a simple, low-impact way to contribute to a freer internet. No technical expertise is required. Here’s how it works:
>
> ---
>
> ## What Is Snowflake?
> Snowflake is a privacy tool integrated with the Tor network. By running a Snowflake proxy, you temporarily route internet traffic for users in censored regions, allowing them to bypass government or institutional blocks. Unlike traditional Tor relays, **Snowflake requires minimal bandwidth, no configuration, and no ongoing maintenance**. Your device acts as a temporary bridge, not a permanent node, ensuring both safety and ease of use.
>
> ---
>
> ### Is This Safe for Me?
>
> Short answer: Yes.
>
> Long answer: probably. Here is why:
>
> - **Your IP address is not exposed** to the websites that the people using your proxy access, so you don't have to worry about what they are doing either. You are **not** an exit node.
> - **No activity logs.** Snowflake cannot monitor or record what users do through your connection. The only stored information is how many people have connected to your bridge. Check the docs for further info on this.
> - **Low resource usage.** The data consumed is comparable to background app activity, far less than streaming video or music.
> - **No direct access to your system.**
> - **No storage of sensitive data.** Snowflake proxies do not store any sensitive data, such as IP addresses or browsing history, on your system.
> - **Encrypted communication.** All communication between the Snowflake proxy and the Tor network is encrypted, making it difficult for attackers to intercept or manipulate data.
>
> You are not hosting a VPN or a full Tor relay. Your role is limited to facilitating encrypted connections, similar to relaying a sealed envelope.
>
> Your IP address is exposed to the user (in a P2P-like connection). Be mindful that your ISP could also potentially see the WebRTC traffic and the connections being made to it (but not the contents), so keep your threat model in mind.
>
> For most users, it is *generally* safe to run Snowflake proxies. Theoretically, your ISP will be able to know that connections are being made there, but to them it will look like you're calling someone on, say, Zoom.
>
> Historically, as far as we know, there haven't been any cases of people getting in legal trouble for running entry relays, middle relays, or bridges. There have been a few cases of people running exit nodes and getting in trouble with law enforcement agencies, but none of them have been arrested or prosecuted as far as I know. *If you are aware of any cases, let me know so I can update this post.*
>
> Do not hesitate to check [Snowflake's official documentation](https://snowflake.torproject.org/) for further reference and to make informed decisions.
>
> ---
>
> ## How to Set Up a Snowflake Proxy
>
> ### Option 1: Browser Extension (Brave, Firefox, or Chrome)
> 1. Install the [Snowflake extension](https://snowflake.torproject.org/).
> 2. Click the Snowflake icon in your browser toolbar and toggle **"Enable Snowflake."**
> 3. Keep the browser open. That’s all.
>
> **Note:** Brave users can enable Snowflake directly in settings. Navigate to `brave://settings/privacy` and activate the option under "Privacy and security."
>
> ---
>
> ### Option 2: Android Devices via Orbot
> 1. Download [Orbot](https://play.google.com/store/apps/details?id=org.torproject.android) (Tor’s official Android app).
> 2. Open the app’s menu, select **"Snowflake Proxy,"** and toggle it on.
> 3. For continuous operation, keep your device charged and connected to Wi-Fi.
>
> Your device will now contribute as a proxy whenever the app is active.
>
> ---
>
> ### Addressing Common Concerns
> - **Battery drain:** Negligible. Snowflake consumes fewer resources than typical social media or messaging apps.
> - **Data usage:** Most users report under 1 GB per month. Adjust data limits in Orbot’s settings or restrict operation to Wi-Fi if necessary.
>
> ---
>
> ## Why Your Participation Matters
> Censorship mechanisms grow more sophisticated every year, but tools like Snowflake empower ordinary users to counteract them. Each proxy strengthens the Tor network’s resilience, making it harder for authoritarian regimes to isolate their populations. **By donating a small amount of bandwidth, you provide someone with a critical connection to uncensored information, education, and global dialogue.**
>
> Recent surges in demand, particularly in Russia, highlight the urgent need for more proxies. Your contribution, however small, has an impact.
>
> By participating, you become part of a global effort to defend digital rights and counter censorship. Please also be mindful of your threat model and understand the potential risks (though they are very small for most people). Check [Snowflake's official documentation](https://snowflake.torproject.org/) for further reference, and don't make any decisions based on this post before taking your time to read through it.
>
> **Please share this post to raise awareness. The more proxies, the stronger the network.**
>
> – llama

That really depends on your threat model. The app doesn’t monitor your activity, nor does it have embedded trackers. It pulls content directly from YouTube’s CDN. All they (Google) get is your IP address, but nothing else. For 99.9% of people that’s totally OK.




There are several ways, honestly. For Android, there’s NewPipe; the app itself fetches the YouTube data. For PC, there are similar applications that do the same, such as FreeTube. Those are the solutions I recommend.

If you’re so inclined, you can also host your own Invidious and/or Piped instances. But I like NewPipe and FreeTube better.