
Bridgy Fed, a project to connect the open social web, is now becoming a nonprofit

17 December 2024 at 10:38

Bridgy Fed, which is working to connect the social network Bluesky with the wider fediverse (i.e., the open social web), which includes sites like Mastodon and others, will be the first app incubated within a new nonprofit called A New Social. The organization, announced Tuesday, aims to bring together developers, researchers, startups, and industry leaders […]

© 2024 TechCrunch. All rights reserved. For personal use only.

Dimension raises $500M second fund for investing at the intersection of tech and life sciences

9 December 2024 at 05:00

Many VCs, particularly newer firms, readily admit that 2024 has been a challenging year for raising fresh capital. Dimension Capital, a two-year-old venture outfit, had a different experience when raising its second fund. "Every investor from fund one came back very quickly," said Zavian Dar (pictured center), one of the firm's three founders and managing […]


Empowering a multigenerational workforce for AI

2 December 2024 at 07:32
[Image: Marjorie Powell. Credit: AARP]

  • Marjorie Powell, AARP's CHRO, is a member of BI's Workforce Innovation board.
  • Powell says creating a collaborative learning environment is key to helping employees adapt to AI.
  • This article is part of "Workforce Innovation," a series exploring the forces shaping enterprise transformation.

As the chief human resources officer at AARP, Marjorie Powell devotes much of her professional energy to meeting the needs of the multigenerational workforce. These days, much of that involves navigating AI's impact to ensure every employee at the nonprofit is prepared for the technological changes shaping the workplace.

"Our goal in everything we do for our employees is to provide the resources, support, and capabilities they need to make good decisions within the company's guidelines," she said. "We take the same approach with AI."

Powell's mission extends beyond AARP's workforce. As an advocate for the 50-and-over demographic, she champions the adaptability and contributions of older workers in a tech-driven economy.

"There's an assumption that people over a certain age are not comfortable with technology, but what's overlooked is that many older people — particularly those at the end of the baby boomer generation — were at the forefront of this technological revolution," she said.

The following has been edited for length and clarity.

How did AARP handle the introduction of AI in its workforce?

We decided to use Copilot because we're already a Microsoft company. We got enough licenses to set up a working group with key people we thought would be super users. The idea was to experiment with AI tools and see how they fit into our workflows.

We wanted to learn and figure out what works and what doesn't. Then, we could make a decision about how we were going to roll it out to the company, since one, it's costly; and two, we wanted people to feel comfortable with it.

What were some of the outcomes of the working group, and how did those results shape the way AARP approached training and support?

We issued a policy, a generative AI use case approval process, and a mandatory training for all staff to complete to learn how to use gen AI in the workplace. The training focused on internal and external use and the types of information that can be shared, public versus private, and so on.

We encouraged our staff to 'Go out there and play with it.' We then surveyed them and asked, What are you using it for? What are some great use cases you've developed? How's it helping you enhance your productivity? How are you using this tool to further the AARP mission?

We also considered what existing structure we could use to encourage staff to use AI and explore the technology. We already had a structure in place called Communities of Practice β€” groups where employees learn and share. It's like an employee resource group (ERG), but focused on learning and development within industry, so we used this model to create an AI Community of Practice.

What are some of the 'great use cases' for AI for your HR team specifically?

We get a lot of calls and emails on simple things about AARP benefits and policies. People ask questions like: I'm having knee surgery next month. How do I sign up for FMLA? or Where do I find my W2? or I bought a Peloton. Is that eligible for the fitness credit? So we started building an HR chatbot to provide that kind of information. It's much easier for employees to ask the chatbot instead of overwhelming a team member with those queries.

We're currently piloting the chatbot with 300-400 frequently asked questions and answers preloaded. It directs employees to the right information without them having to dig and helps us understand what additional information we need to include.
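At its core, a bot like the one Powell describes is retrieval over a preloaded question bank. The sketch below is purely illustrative (the questions, answers, and fuzzy-matching approach are assumptions, not AARP's implementation):

```python
from difflib import SequenceMatcher

# Illustrative entries only; the real pilot preloads 300-400 Q&A pairs.
FAQ = {
    "How do I sign up for FMLA?": "Submit a leave request through the HR portal.",
    "Where do I find my W2?": "W-2 forms are under Pay > Tax Documents.",
    "Is a Peloton eligible for the fitness credit?": "See the wellness-benefit list.",
}

def answer(question: str, threshold: float = 0.5) -> str:
    """Return the answer for the closest-matching FAQ entry, or escalate."""
    best_q = max(
        FAQ,
        key=lambda q: SequenceMatcher(None, question.lower(), q.lower()).ratio(),
    )
    score = SequenceMatcher(None, question.lower(), best_q.lower()).ratio()
    if score >= threshold:
        return FAQ[best_q]
    # Low-confidence matches go to a person instead of a guessed answer.
    return "No close match; routing your question to an HR team member."
```

The escalation branch doubles as the feedback loop Powell mentions: unmatched questions reveal what additional information the bank needs.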

Many employers are using AI tools in hiring, but there are concerns about potential bias. What's your perspective on this?

We use AI for sourcing candidates. All AARP recruiters are certified to conduct Boolean searches to increase the accuracy of identifying talent with specific skill sets in the marketplace.

But when it comes to screening and interviewing, we don't use AI. We find that the technology is still very biased, specifically when it comes to age. Until the technology matures enough to minimize bias, I don't believe it's a good idea to use it without that human component of judgment.

Speaking of age, what are your thoughts on ageism in the workplace today, especially from companies hesitant to hire older workers?

Companies don't want to be the kind of organization that isn't welcoming to talent, regardless of age. Due to the economy and the rising cost of healthcare, many people in the 50-plus community are re-entering the workforce.

Many in that age group have valuable skills and experience and are eager to return. They often say, 'I don't need to be in a leadership role. Been there, done that. I just want to help and be of use.' They also naturally take on mentorship roles, as people seek their guidance. By embracing this segment of the workforce, companies can gain huge value.

What do employers misunderstand about older workers and technology?

Baby boomers were at the forefront of the technology era, and they're more comfortable with technology than many people realize. In fact, they are among the largest consumers of technology products. Tech companies really need to pay attention to this demographic.

I look at myself β€” I'm about to turn 60 β€” and I was selling Commodore 64s when I was in high school. I've seen everything from floppy disks to CDs, to cassette tapes, to 8-tracks, to digital streaming and everything else. I've experienced all versions of technology, and I've adapted. I'm still willing to adapt, and I'm still learning.

Read the original article on Business Insider

New York City's Meatpacking District will say goodbye to its last meatpacker — and a 60-story tower could be on its way

25 November 2024 at 11:54
[Image: John Jobbagy, whose family has been working in the Meatpacking District for more than 120 years, is one of the last meatpackers left there. Credit: AP Photo/Julia Demaree Nikhinson]

  • The last meatpackers in NYC's Meatpacking District are getting ready to close shop.
  • Last month, NYC's mayor announced plans to develop the site near Greenwich Village and the High Line.
  • Once a meat industry hub, the district now hosts luxury brands and nightlife venues.

The era of New York City's Meatpacking District as a neighborhood where people actually pack meat is coming to an end.

Late last month, New York City Mayor Eric Adams unveiled plans to redevelop the district's last operating meat market after its tenants accepted a deal from the city to move out — and in the market's place could come a 60-story tower.

Once brimming with hundreds of butchers, slaughterhouses, and packing plants, the Manhattan neighborhood now has only a handful of meatpackers left, and they're preparing to close up shop, the Associated Press reported this week in a retrospective on the district.

[Image: A section of the Meatpacking District in 1929. Credit: New York City Municipal Archives via AP]

Under the city's plan, the 66,000-square-foot Gansevoort Market would become Gansevoort Square, which, according to the mayor's office, would feature 600 mixed-income housing units, a new open pavilion, and a culture and arts hub.

And a New York state senator said there's a plan to build a 60-story skyscraper in the area β€” something a local historic preservation group said was out of scale for a neighborhood with mostly low-rise buildings.

The city hasn't confirmed the plans referenced by State Sen. Brad Hoylman-Sigal in a recent email newsletter he sent to constituents. The community group Village Preservation said Monday that a tower plan would likely be formally announced at an upcoming neighborhood Community Board meeting.

A building that tall would dramatically alter the neighborhood's skyline, where the current tallest structure, The Standard Hotel, is 19 stories tall. The mayor's office didn't immediately return a request for comment on the possible skyscraper development.

[Image: A rendering of the vision for Gansevoort Square in the Meatpacking District. Credit: City of New York/X]

Meanwhile, though an eviction date has not yet been set for the building's meatpacking tenants, they're getting ready to say goodbye.

One of them is 68-year-old John Jobbagy, whose connection to the district goes back more than 120 years. His grandfather started butchering there after immigrating from Budapest in 1900, the AP reported.

Back then, the Meatpacking District looked — and smelled — a lot different from today, where high-end retailers like Gucci and Rolex now line the streets alongside cocktail bars, clubs, and luxury apartment buildings. In 2025, high-end French crystal company Baccarat is moving into the neighborhood, Women's Wear Daily first reported this month.

"I'll be here when this building closes, when everybody, you know, moves on to something else," Jobbagy told the AP. "And I'm glad I was part of it, and I didn't leave before."

[Image: Shoppers wait in line for a sample sale in the Meatpacking District in 2024. Credit: AP Photo/Julia Demaree Nikhinson]

Jobbagy told the AP that he started working for his father in the area in the late 1960s, at a time when chicken juices dribbled into the streets, and workers relied on whiskey to keep themselves warm in the refrigerated lockers.

Jobbagy later opened his own business there, which he's held onto as the neighborhood changed over the years, the AP reported.

The neighborhood became a gritty nightlife and sex club scene in the 1970s and, by the early 2000s, a hip, up-and-coming area where "Sex and the City's" Samantha Jones chose to live amid sex workers, leather bars, and an incoming Pottery Barn.

In 2009, the railway that once transported millions of tons of meat, dairy, and produce through the district was turned into a public park, the High Line.

[Image: A man working in the Meatpacking District in 1927. Credit: New York City Municipal Archives via AP]

But Jobbagy told local outlet amNY he isn't too broken up about leaving the neighborhood that would now be unrecognizable to his father or grandfather.

"It's been a long time coming," Jobbagy told amNY. "The transformations have been taking place for the last 20 years. We're well aware there are far better uses for this property than an aging meat warehouse. I'm not really sad at all."

Change has always been part of the district's DNA, and New York City's.

"It wasn't always a meatpacking district," Andrew Berman, the executive director at historic preservation group Village Preservation, told the AP. "It was a sort of wholesale produce district before that, and it was a shipping district before that." In the early 1800s, it became home to a military fort, built there over fears that the British would invade during the War of 1812.

"So it's had many lives, and it's going to continue to have new lives," Berman told the AP.

Read the original article on Business Insider

AI adoption is surging — but humans still need to be in the loop, say software developers from Meta, Amazon, Nice, and more

22 November 2024 at 09:27
[Image: Top row: Greg Jennings, Aditi Mithal, Pooya Amini, and Shruti Kapoor. Bottom row: Neeraj Verma, Kesha Williams, and Igor Ostrovsky. Credit: Alyssa Powell/BI]

This article is part of "CXO AI Playbook" — straight talk from business leaders on how they're testing and using AI.

The future of software-development jobs is changing rapidly as more companies adopt AI tools that can accelerate the coding process and close experience gaps between junior- and senior-level developers.

Increased AI adoption could be part of the tech industry's "white-collar recession," which has seen slumps in hiring and recruitment over the past year. Yet integrating AI into workflows can offer developers the tools to focus on creative problem-solving and building new features.

On November 14, Business Insider convened a roundtable of software developers as part of our "CXO AI Playbook" series to learn how artificial intelligence was changing their jobs and careers. The conversation was moderated by Julia Hood and Jean Paik from BI's Special Projects team.

These developers discussed the shifts in their day-to-day tasks, which skills people would need to stay competitive in the industry, and how they navigate the expectations of stakeholders who want to stay on the cutting edge of this new technology.

Panelists said AI has boosted their productivity by helping them write and debug code, which has freed up their time for higher-order problems, such as designing software and devising integration strategies.

However, they emphasized that some of the basics of software engineering — learning programming languages, scaling models, and handling large-scale data — would remain important.

The roundtable participants also said developers could provide critical insight into challenges around AI ethics and governance.

The roundtable participants were:

  • Pooya Amini, software engineer, Meta.
  • Greg Jennings, head of engineering for AI, Anaconda.
  • Shruti Kapoor, lead member of technical staff, Slack.
  • Aditi Mithal, software-development engineer, Amazon Q.
  • Igor Ostrovsky, cofounder, Augment.
  • Neeraj Verma, head of applied AI, Nice.
  • Kesha Williams, head of enterprise architecture and engineering, Slalom.

The following discussion was edited for length and clarity.


Julia Hood: What has changed in your role since the popularization of gen AI?

Neeraj Verma: I think the expectations that are out there in the market for developers on the use of AI are actually almost a bigger impact than the AI itself. You hear about how generative AI is sort of solving this blank-paper syndrome. Humans have this concept that if you give them a blank paper and tell them to go write something, they'll be confused forever. And generative AI is helping overcome that.

The expectation from executives now is that developers are going to be significantly faster but that some of the creative work the developers are doing is going to be taken away — which we're not necessarily seeing. We're seeing it as more of a boilerplate creation mechanism for efficiency gains.

Aditi Mithal: I joined Amazon two years ago, and I've seen how my productivity has changed. I don't have to focus on doing repetitive tasks. I can just ask Amazon Q chat to do that for me, and I can focus on more-complex problems that can actually impact our stakeholders and our clients. I can focus on higher-order problems instead of more-repetitive tasks for which the code is already out there internally.

Shruti Kapoor: One of the big things I've noticed with writing code is how open companies have become to AI tools like Cursor and Copilot and how integrated they've become into the software-development cycle. It's no longer considered a no-no to use AI tools like ChatGPT. I think two years ago when ChatGPT came out, it was a big concern that you should not be putting your code out there. But now companies have kind of embraced that within the software-development cycle.

Pooya Amini: Looking back at smartphones and Google Maps, it's hard to remember what the world looked like before these technologies. It's a similar situation with gen AI — I can't remember how I was solving the problem without it. I can focus more on actual work.

Now I use AI as a kind of assisted tool. My main focus at work is on requirement gathering, like software design. When it comes to the coding, it's going to be very quick. Previously, it could take weeks. Now it's a matter of maybe one or two days, so then I can actually focus on other stuff as AI is solving the rest for me.

Kesha Williams: In my role, it's been trying to help my team rethink their roles and not see AI as a threat but more as a partner that can help boost productivity, and encouraging my team to make use of some of the new embedded AI and gen-AI tools. Really helping my team upskill and putting learning paths in place so that people can embrace AI and not be afraid of it. More of the junior-level developers are really afraid about AI replacing them.


Hood: Are there new career tracks opening up now that weren't here before?

Verma: At Nice, we have something like 3,000 developers, and over the last, I think, 24 months, 650 of them have shifted into AI-specific roles, which was sort of unheard of before. Even out of those 650, we've got about a hundred who are experts at things like prompt engineering. Over 20% of our developers are not just developers being supported by AI but developers using AI to write features.

Kapoor: I think one of the biggest things I've noticed in the last two to three years is the rise of a job title called "AI engineer," which did not exist before, and it's kind of in between an ML engineer and a traditional software engineer. I'm starting to see more and more companies where AI engineer is one of the top-paying jobs available for software engineers. One of the cool things about this job is that you don't need an ML-engineering background, which means it's accessible to a lot more people.

Greg Jennings: For developers who are relatively new or code-literate knowledge workers, I think they can now use code to solve problems where previously they might not have. We have designers internally that are now creating full-blown interactive UIs using AI to describe what they want and then providing that to engineers. They've never been able to do that before, and it greatly accelerates the cycle.

For more-experienced developers, I think there are a huge number of things that we still have to sort out: the architectures of these solutions, how we're actually going to implement them in practice. The nature of testing is going to have to change a lot as we start to include these applications in places where they're more mission-critical.

Amini: On the other side, looking at threats that can come out of AI, new technologies and new positions can emerge as well. We don't currently have clear regulations in terms of ownership or the issues related to gen AI, so I imagine there will be more positions in terms of ethics.

Mithal: I feel like a Ph.D. is not a requirement anymore to be a software developer. If you have some foundational ML, NLP knowledge, you can target some of these ML-engineer or AI-engineer roles, which gives you a great opportunity to be in the market.

Williams: I'm seeing new career paths in specialized fields around ML and LLM operations. For my developers, they're able to focus more on strategy and system design and creative problem-solving, and it seems to help them move faster into architecture. System design, system architecture, and integration strategies — they have more time to do that because of AI.


Jean Paik: What skills will developers need to stay competitive?

Verma: I think a developer operating an AI system requires product-level understanding of what you're trying to build at a high level. And I think a lot of developers struggle with prompt engineering from that perspective. Having the skills to clearly articulate what you want to an LLM is a very important skill.

Williams: Developers need to understand machine-learning concepts and how AI models work, not necessarily how to build and train these models from scratch but how to use them effectively. As we're starting to use Amazon Q, I've realized that our developers are now becoming prompt engineers because you have to get that prompt right in order to get the best results from your gen-AI system.

Jennings: Understanding how to communicate with these models is very different. I almost think that it imparts a need for engineers to have a little bit more of a product lens, where a deeper understanding of the actual business problem they're trying to solve is necessary to get the most out of it. Developing evaluations that you can use to optimize those prompts, so going from prompt engineering to actually tuning the prompts in a more-automated way, is going to emerge as a more common approach.
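The evaluations Jennings describes can be as simple as scoring candidate prompts against a small labeled set and keeping the winner. A minimal sketch of that loop, with a toy stand-in for a real LLM client (the model, prompts, and examples are all hypothetical):

```python
def evaluate(model, prompt_template: str, examples) -> float:
    """Fraction of labeled examples the model gets right under this prompt."""
    hits = sum(
        1 for text, expected in examples
        if model(prompt_template.format(input=text)) == expected
    )
    return hits / len(examples)

def best_prompt(model, candidates, examples) -> str:
    """Pick the candidate prompt that scores highest on the labeled set."""
    return max(candidates, key=lambda p: evaluate(model, p, examples))

# Toy stand-ins: a fake "model" that only answers correctly when the
# prompt mentions "sentiment", plus two hypothetical candidate prompts.
toy_model = lambda prompt: "positive" if "sentiment" in prompt else "unsure"
candidates = [
    "Label the sentiment of this text: {input}",
    "What do you think about: {input}",
]
examples = [("great product", "positive"), ("love it", "positive")]
```

Swapping the toy model for a real client turns this into the "tuning prompts in a more-automated way" Jennings anticipates: the labeled set stays fixed while candidate prompts are generated and scored mechanically.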

Igor Ostrovsky: Prompt engineering is really important. That's how you interact with AI systems, but this is something that's evolving very quickly. Software development will change in five years much more rapidly than anything we've seen before. How you architect, develop, test, and maintain software — that will all change, and how exactly you interact with AI will also evolve.

I think prompt engineering is more of a sign that some developers have the desire to learn and are eager to figure out how to interact with artificial intelligence, but it won't necessarily be how you interact with AI in three years or five years. Software developers will need this desire to adapt and learn and have the ability to solve hard problems.

Mithal: As a software developer, some of the basics won't change. You need to understand how to scale models, build scalable solutions, and handle large-scale data. When you're training an AI model, you need data to support it.

Kapoor: Knowledge of a programming language would be helpful, specifically Python or even JavaScript. Knowledge of ML or some familiarity with ML will be really helpful. Another thing is that we need to make sure our applications are a lot more fault-tolerant. That is also a skill that front-end or back-end engineers who want to transition to an AI-engineering role need to be aware of.

One of the biggest problems with prompts is that the answers can be very unpredictable and can lead to a lot of different outputs, even for the same prompt. So being able to make your application fault-tolerant is one of the biggest skills we need to apply in AI engineering.
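In practice, the fault tolerance Kapoor describes often amounts to validating model output before the application consumes it, retrying on failure, and falling back to a safe default. A minimal sketch, assuming the model is asked to return JSON (the `call_model` stub stands in for any real LLM client):

```python
import json

def call_model(prompt: str) -> str:
    """Stub for a real LLM call; real responses are free-form text that
    may or may not be the JSON you asked for."""
    return '{"sentiment": "positive", "score": 0.9}'

def classify(prompt: str, retries: int = 3) -> dict:
    """Validate the model's output and retry; fall back rather than crash."""
    for _ in range(retries):
        raw = call_model(prompt)
        try:
            data = json.loads(raw)
        except json.JSONDecodeError:
            continue  # Malformed output: ask again.
        if isinstance(data, dict) and "sentiment" in data:
            return data  # Passed the schema check.
    # Fallback keeps the application running even when the model misbehaves.
    return {"sentiment": "unknown", "score": 0.0}
```

The key design point is that unpredictable output is treated as the normal case: every response is parsed and schema-checked, and the caller always receives a well-formed result.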


Hood: What are the concerns and obstacles you have as AI gains momentum? How do you manage the expectations of nontech stakeholders in the organization who want to stay on the leading edge?

Ostrovsky: Part of the issue is that interacting with ChatGPT or cloud AI is so easy and natural that it can be surprising how hard it is actually to control AI behavior, where you need AI to understand constraints, have access to the right information at the right time, and understand the task.

When setting expectations with stakeholders, it is important they understand that we're working with this very advanced technology and they are realistic about the risk profile of the project.

Mithal: One is helping them understand the trade-offs. It could be security versus innovation or speed versus accuracy. The second is metrics. Is it actually improving the efficiency? How much is the acceptance rate for our given product? Communicating all those to the stakeholders gives them an idea of whether the product they're using is making an impact or if it's actually helping the team become more productive.

Williams: Some of the challenges I'm seeing are mainly around ethical AI concerns, data privacy, and costly and resource-intensive models that go against budget and infrastructure constraints. On the vendor or stakeholder side, it's really more about educating our nontechnical stakeholders about the capabilities of AI and the limitations and trying to set realistic expectations.

We try to help our teams understand for their specific business area how AI can be applied. So how can we use AI in marketing or HR or legal, and giving them real-world use cases.

Verma: Gen AI is really important, and it's so easy to use ChatGPT, but what we find is that gen AI makes a good developer better and a worse developer worse. Good developers understand how to write good code and how good code integrates into projects. ChatGPT is just another tool to help write some of the code that fits into the project. That's the big challenge that we try to make sure our executives understand, that not everybody can use this in the most effective manner.

Jennings: There are some practical governance concerns that have emerged. One is understanding the tolerance for bad responses in certain contexts. Some problems, you may be more willing to accept a bad response because you structure the interface in such a way that there's a human in the loop. If you're attempting to not have a human in the loop, that could be problematic depending on what you want the model to do. Just getting better muscle for the organization to have a good intuition about where these models can potentially fail and in what ways.

In addition to that, understanding what training data went into that model, especially as models are used more as agents and have privileged access to different applications and data sources that might be pretty sensitive.

Kapoor: I think one of the biggest challenges that can happen is how companies use the data that comes back from LLM models and how they're going to use it within the application. Removing the human component scares me a lot.

Verma: It's automation versus augmentation. There are a lot of cases where augmentation is the big gain. I think automation is a very small, closed case — there are very few things in the world right now that I think LLMs are ready to automate.

Read the original article on Business Insider

Ai2’s open source Tülu 3 lets anyone play the AI post-training game

21 November 2024 at 09:00

Ask anyone in the open source AI community, and they will tell you the gap between them and the big private companies is more than just computing power. Ai2 is working to fix that, first with fully open source databases and models and now with an open and easily adapted post-training regimen to turn "raw" […]

